Sep 30 17:01:37 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 17:01:38 crc restorecon[4744]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:01:38 crc restorecon[4744]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 
17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:01:38 crc 
restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:01:38 crc restorecon[4744]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
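
The relabeling pass ends here. Every restorecon entry above is the same report: a path under /var/lib/kubelet already carries an admin-customized SELinux label (container_file_t, usually with per-pod MCS categories such as s0:c7,c13 or s0:c682,c947), so restorecon left it in place rather than resetting it to the policy default. The exception is /var/usrlocal/bin/kubenswrapper, relabeled from bin_t to kubelet_exec_t just before the kubelet starts; the kubenswrapper messages that begin above and continue below are that startup, with the kubelet steering deprecated command-line flags toward its --config file. When auditing a dump like this, it helps to collapse the restorecon stream into per-context counts. The following is a minimal, hypothetical Python sketch (not part of any tooling shown in this log), assuming the journal text arrives on stdin:

import re
import sys
from collections import Counter

# Match restorecon's "not reset" report: a path whose SELinux context was
# customized by the admin and therefore left alone. \s+ is used between
# words because this dump wraps entries across lines.
ENTRY = re.compile(
    r"restorecon\[\d+\]:\s+(?P<path>/\S+)\s+not\s+reset\s+as\s+customized\s+"
    r"by\s+admin\s+to\s+(?P<context>\S+)"
)

# Tally skipped paths per target SELinux context and print the largest first.
counts = Counter(m.group("context") for m in ENTRY.finditer(sys.stdin.read()))
for context, n in counts.most_common():
    print(f"{n:6d}  {context}")

Fed the boot journal (e.g. journalctl -b), it would print one line per target context, which makes it easy to see that the bulk of this section concerns the two catalog pods labeled s0:c7,c13.
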
Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 17:01:39 crc kubenswrapper[4772]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.569430 4772 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576118 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576164 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576173 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576182 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576192 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576201 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576210 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576218 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576224 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576233 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576240 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576247 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576254 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576261 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576268 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576275 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576282 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576289 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576304 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576312 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576319 4772 
feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576326 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576333 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576340 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576347 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576355 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576362 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576370 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576381 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576394 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576403 4772 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576412 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576422 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576430 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576439 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576449 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576456 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576464 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576477 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576488 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576498 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576508 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576516 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576524 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576532 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576540 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576548 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576556 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576563 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576570 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576577 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576585 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576592 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576598 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576605 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576612 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576619 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576626 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576635 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
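[Editor's note] The long run of "feature_gate.go:330] unrecognized feature gate: ..." warnings is consistent with OpenShift handing its cluster-wide gate list (GatewayAPI, NewOLM, PinnedImages, and so on) to a kubelet that was only compiled with the upstream Kubernetes gates: unknown names are warned about and skipped rather than treated as fatal. The sketch below is not the actual Kubernetes implementation, just a self-contained illustration of that warn-and-ignore pattern.

```go
// Minimal sketch (not the real k8s feature_gate.go) of warning on
// unrecognized feature-gate names instead of failing hard.
package main

import "log"

// Gates this hypothetical binary was compiled with.
var known = map[string]bool{
	"CloudDualStackNodeIPs": true,
	"KMSv1":                 true,
}

func set(requested map[string]bool) map[string]bool {
	enabled := map[string]bool{}
	for name, val := range requested {
		if _, ok := known[name]; !ok {
			// cf. "feature_gate.go:330] unrecognized feature gate: <name>"
			log.Printf("W unrecognized feature gate: %s", name)
			continue
		}
		enabled[name] = val
	}
	return enabled
}

func main() {
	// "GatewayAPI" is unknown here and only draws a warning.
	set(map[string]bool{"GatewayAPI": true, "KMSv1": true})
}
```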
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576643 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576652 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576660 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576667 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576676 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576682 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576689 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576701 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576723 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576731 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576739 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.576747 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577833 4772 flags.go:64] FLAG: --address="0.0.0.0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577858 4772 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577877 4772 flags.go:64] FLAG: --anonymous-auth="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577889 4772 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577902 4772 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577910 4772 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577923 4772 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577935 4772 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577944 4772 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577953 4772 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577963 4772 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577973 4772 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577982 4772 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.577991 4772 flags.go:64] FLAG: --cgroup-root="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578000 4772 flags.go:64] FLAG: 
--cgroups-per-qos="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578009 4772 flags.go:64] FLAG: --client-ca-file="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578018 4772 flags.go:64] FLAG: --cloud-config="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578026 4772 flags.go:64] FLAG: --cloud-provider="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578035 4772 flags.go:64] FLAG: --cluster-dns="[]" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578046 4772 flags.go:64] FLAG: --cluster-domain="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578093 4772 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578103 4772 flags.go:64] FLAG: --config-dir="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578111 4772 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578121 4772 flags.go:64] FLAG: --container-log-max-files="5" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578132 4772 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578141 4772 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578150 4772 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578159 4772 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578168 4772 flags.go:64] FLAG: --contention-profiling="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578176 4772 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578185 4772 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578195 4772 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578204 4772 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578215 4772 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578224 4772 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578232 4772 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578241 4772 flags.go:64] FLAG: --enable-load-reader="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578249 4772 flags.go:64] FLAG: --enable-server="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578258 4772 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578271 4772 flags.go:64] FLAG: --event-burst="100" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578281 4772 flags.go:64] FLAG: --event-qps="50" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578289 4772 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578297 4772 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578306 4772 flags.go:64] FLAG: --eviction-hard="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578316 4772 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" 
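[Editor's note] The "flags.go:64] FLAG: --name=value" run is the kubelet logging the effective value of every registered flag at startup, defaults included, which is why rarely-used cadvisor and storage-driver flags appear alongside the ones actually set. A stdlib-only sketch of the same dump pattern (the kubelet itself uses spf13/pflag, but the idea is identical):

```go
// Sketch of the "FLAG: --name=value" startup dump using the stdlib flag
// package; VisitAll walks every registered flag, set or not.
package main

import (
	"flag"
	"log"
)

func main() {
	v := flag.Int("v", 2, "log verbosity")
	flag.Parse()
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
	_ = v
}
```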
Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578325 4772 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578332 4772 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578341 4772 flags.go:64] FLAG: --eviction-soft="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578350 4772 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578357 4772 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578366 4772 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578374 4772 flags.go:64] FLAG: --experimental-mounter-path="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578382 4772 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578390 4772 flags.go:64] FLAG: --fail-swap-on="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578399 4772 flags.go:64] FLAG: --feature-gates="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578409 4772 flags.go:64] FLAG: --file-check-frequency="20s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578417 4772 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578426 4772 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578434 4772 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578443 4772 flags.go:64] FLAG: --healthz-port="10248" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578452 4772 flags.go:64] FLAG: --help="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578461 4772 flags.go:64] FLAG: --hostname-override="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578469 4772 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578479 4772 flags.go:64] FLAG: --http-check-frequency="20s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578488 4772 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578496 4772 flags.go:64] FLAG: --image-credential-provider-config="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578504 4772 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578513 4772 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578520 4772 flags.go:64] FLAG: --image-service-endpoint="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578529 4772 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578537 4772 flags.go:64] FLAG: --kube-api-burst="100" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578547 4772 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578557 4772 flags.go:64] FLAG: --kube-api-qps="50" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578567 4772 flags.go:64] FLAG: --kube-reserved="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578576 4772 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 30 17:01:39 crc 
kubenswrapper[4772]: I0930 17:01:39.578583 4772 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578607 4772 flags.go:64] FLAG: --kubelet-cgroups="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578615 4772 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578624 4772 flags.go:64] FLAG: --lock-file="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578632 4772 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578641 4772 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578649 4772 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578663 4772 flags.go:64] FLAG: --log-json-split-stream="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578671 4772 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578680 4772 flags.go:64] FLAG: --log-text-split-stream="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578689 4772 flags.go:64] FLAG: --logging-format="text" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578697 4772 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578708 4772 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578717 4772 flags.go:64] FLAG: --manifest-url="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578727 4772 flags.go:64] FLAG: --manifest-url-header="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578739 4772 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578750 4772 flags.go:64] FLAG: --max-open-files="1000000" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578764 4772 flags.go:64] FLAG: --max-pods="110" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578773 4772 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578782 4772 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578791 4772 flags.go:64] FLAG: --memory-manager-policy="None" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578800 4772 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578808 4772 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578817 4772 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578825 4772 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578847 4772 flags.go:64] FLAG: --node-status-max-images="50" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578855 4772 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578864 4772 flags.go:64] FLAG: --oom-score-adj="-999" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578871 4772 flags.go:64] FLAG: --pod-cidr="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578879 4772 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578895 4772 flags.go:64] FLAG: --pod-manifest-path="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578903 4772 flags.go:64] FLAG: --pod-max-pids="-1" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578911 4772 flags.go:64] FLAG: --pods-per-core="0" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578919 4772 flags.go:64] FLAG: --port="10250" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578928 4772 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578936 4772 flags.go:64] FLAG: --provider-id="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578944 4772 flags.go:64] FLAG: --qos-reserved="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578955 4772 flags.go:64] FLAG: --read-only-port="10255" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578963 4772 flags.go:64] FLAG: --register-node="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578972 4772 flags.go:64] FLAG: --register-schedulable="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578981 4772 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.578996 4772 flags.go:64] FLAG: --registry-burst="10" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579005 4772 flags.go:64] FLAG: --registry-qps="5" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579013 4772 flags.go:64] FLAG: --reserved-cpus="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579021 4772 flags.go:64] FLAG: --reserved-memory="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579037 4772 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579046 4772 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579076 4772 flags.go:64] FLAG: --rotate-certificates="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579086 4772 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579094 4772 flags.go:64] FLAG: --runonce="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579103 4772 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579112 4772 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579121 4772 flags.go:64] FLAG: --seccomp-default="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579135 4772 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579144 4772 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579153 4772 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579162 4772 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579171 4772 flags.go:64] FLAG: --storage-driver-password="root" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579179 4772 flags.go:64] FLAG: --storage-driver-secure="false" Sep 30 17:01:39 crc 
kubenswrapper[4772]: I0930 17:01:39.579187 4772 flags.go:64] FLAG: --storage-driver-table="stats" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579195 4772 flags.go:64] FLAG: --storage-driver-user="root" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579204 4772 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579213 4772 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579221 4772 flags.go:64] FLAG: --system-cgroups="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579229 4772 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579243 4772 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579252 4772 flags.go:64] FLAG: --tls-cert-file="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579260 4772 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579276 4772 flags.go:64] FLAG: --tls-min-version="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579285 4772 flags.go:64] FLAG: --tls-private-key-file="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579293 4772 flags.go:64] FLAG: --topology-manager-policy="none" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579301 4772 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579310 4772 flags.go:64] FLAG: --topology-manager-scope="container" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579321 4772 flags.go:64] FLAG: --v="2" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579333 4772 flags.go:64] FLAG: --version="false" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579345 4772 flags.go:64] FLAG: --vmodule="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579355 4772 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.579365 4772 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579587 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579597 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579605 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579613 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579620 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579626 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579633 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579644 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579651 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579658 4772 feature_gate.go:330] unrecognized feature gate: NewOLM 
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579665 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579673 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579681 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579689 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579698 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579705 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579714 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579725 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579734 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579743 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579753 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579761 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579768 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579775 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579783 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579792 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
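[Editor's note] The feature_gate.go:351/:353 lines are the complement of the unrecognized-gate warnings: CloudDualStackNodeIPs, ValidatingAdmissionPolicy, and KMSv1 are recognized, but they sit in GA or deprecated lifecycle stages, so setting them explicitly still draws a removal notice. A stage-aware sketch, assuming a simplified three-stage lifecycle (not the real component-base code):

```go
// Sketch of stage-aware gate warnings under an assumed three-stage lifecycle.
package main

import "log"

type stage int

const (
	alpha stage = iota // settable without complaint
	ga                 // locked in; explicit setting warns
	deprecated         // on the way out; explicit setting warns
)

var stages = map[string]stage{
	"CloudDualStackNodeIPs": ga,
	"KMSv1":                 deprecated,
}

func warnIfSettled(name string, val bool) {
	switch stages[name] {
	case ga:
		log.Printf("W Setting GA feature gate %s=%v. It will be removed in a future release.", name, val)
	case deprecated:
		log.Printf("W Setting deprecated feature gate %s=%v. It will be removed in a future release.", name, val)
	}
}

func main() {
	warnIfSettled("CloudDualStackNodeIPs", true)
	warnIfSettled("KMSv1", true)
}
```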
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579799 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579805 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579812 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579819 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579825 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579833 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579840 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579846 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579852 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579859 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579865 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579871 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579878 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579887 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579893 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579900 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579906 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579912 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579918 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579925 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579931 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579938 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579945 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579952 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579959 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579966 4772 
feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579973 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579980 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579986 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.579993 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580002 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580009 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580016 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580022 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580030 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580037 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580044 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580084 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580095 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580104 4772 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580112 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580122 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580131 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580140 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.580148 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.580176 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.593816 4772 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.593873 4772 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594093 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594117 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594131 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594141 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594151 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594161 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594172 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594185 4772 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594196 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594206 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594216 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594226 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594236 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594246 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594255 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594265 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594276 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594286 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:01:39 crc 
kubenswrapper[4772]: W0930 17:01:39.594296 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594308 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594317 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594327 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594337 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594348 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594358 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594369 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594379 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594390 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594399 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594409 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594419 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594433 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594446 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594456 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594488 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594500 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594509 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594520 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594530 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594539 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594550 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594560 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594570 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594580 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594591 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594600 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594610 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594621 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594631 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594641 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594652 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594662 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594672 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594686 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594700 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594712 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594726 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
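[Editor's note] Each warning run ends in an info line, "feature_gate.go:386] feature gates: {map[...]}", showing the resolved view: compiled-in defaults overlaid with whatever was set explicitly, which is how KMSv1 ends up true while most entries stay at their defaults. A trivial sketch of that overlay, with hypothetical default values:

```go
// Sketch: the resolved gate map is defaults overlaid with explicit settings.
package main

import "fmt"

func main() {
	defaults := map[string]bool{"NodeSwap": false, "KMSv1": false}
	explicit := map[string]bool{"KMSv1": true} // from config/flags

	final := map[string]bool{}
	for k, v := range defaults {
		final[k] = v
	}
	for k, v := range explicit {
		final[k] = v // explicit setting wins
	}
	fmt.Printf("feature gates: %v\n", final)
}
```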
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594752 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594763 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594774 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594786 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594796 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594806 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594818 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594829 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594839 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594852 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594868 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594879 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594891 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.594918 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.594936 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595408 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595442 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595454 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595493 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595505 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595517 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595529 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:01:39 crc 
kubenswrapper[4772]: W0930 17:01:39.595540 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595576 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595590 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595606 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595617 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595628 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595639 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595652 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595664 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595677 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595690 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595701 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595712 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595722 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595732 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595742 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595752 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595762 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595771 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595781 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595792 4772 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595803 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595813 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595823 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595832 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 
17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595844 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595854 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595881 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595892 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595902 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595913 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595924 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595933 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595943 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595953 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595963 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595974 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595984 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.595994 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596004 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596013 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596024 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596034 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596047 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596094 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596105 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596115 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596125 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596135 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596144 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596153 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596163 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596173 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596182 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596191 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596200 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596210 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596221 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596230 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596241 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596250 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596260 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596270 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.596301 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.596318 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.598710 4772 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.607170 4772 bootstrap.go:85] "Current kubeconfig file 
contents are still valid, no bootstrap necessary" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.607378 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.610592 4772 server.go:997] "Starting client certificate rotation" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.610646 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.610966 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 16:42:53.715029006 +0000 UTC Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.611093 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2327h41m14.103979238s for next certificate rotation Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.651625 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.655131 4772 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.681234 4772 log.go:25] "Validated CRI v1 runtime API" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.731037 4772 log.go:25] "Validated CRI v1 image API" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.734576 4772 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.745394 4772 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-14-22-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.745450 4772 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.779620 4772 manager.go:217] Machine: {Timestamp:2025-09-30 17:01:39.773406791 +0000 UTC m=+0.680419702 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0dcd8a16-1277-4116-9b8a-7e3bf2155fd4 BootID:8cd548ba-29ed-4d2b-b59b-8b79e6073d1d Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 
DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ac:33:2b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ac:33:2b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b9:5e:f6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c3:3c:0e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cf:cc:cc Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7b:29:0d Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:c1:db:1b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8e:43:39:1a:06:01 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:f2:4e:9d:8a:69:7a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 
Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.780883 4772 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.781254 4772 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.784909 4772 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.785517 4772 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.785703 4772 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.786268 4772 topology_manager.go:138] "Creating topology manager with none policy" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.786389 4772 container_manager_linux.go:303] "Creating device plugin manager" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.787165 4772 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.787374 4772 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.787778 4772 state_mem.go:36] "Initialized new in-memory state store" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.788040 4772 server.go:1245] "Using root directory" path="/var/lib/kubelet" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.792981 4772 kubelet.go:418] "Attempting to sync node with API server" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.793204 4772 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.793405 4772 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.793560 4772 kubelet.go:324] "Adding apiserver pod source" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.793698 4772 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.800491 4772 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.802508 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.803657 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.803986 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.804002 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.805100 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.817034 4772 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819418 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819463 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 30 17:01:39 crc 
kubenswrapper[4772]: I0930 17:01:39.819478 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819492 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819515 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819529 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819543 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819564 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819580 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819596 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819617 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819630 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.819654 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.820483 4772 server.go:1280] "Started kubelet" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.822319 4772 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.822389 4772 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.822618 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:39 crc systemd[1]: Started Kubernetes Kubelet. 
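[editor's note] At this point the kubelet is up and listening on 0.0.0.0:10250, but every client-go list/watch and the CSINode lookup fails with "dial tcp 38.102.83.115:6443: connect: connection refused": api-int.crc.testing:6443 is the kube-apiserver, which on this single-node cluster runs as a static pod (note the "Adding static pod path /etc/kubernetes/manifests" entry) that this same kubelet has not started yet, so these errors are expected at boot and the reflectors simply retry. A throwaway connectivity probe for that endpoint, assuming the same host/port from the log (purely illustrative — the kubelet itself retries via its reflectors):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// Probe the apiserver TCP endpoint the kubelet's reflectors are dialing.
func main() {
	conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 3*time.Second)
	if err != nil {
		// During early boot this prints the same "connection refused" seen above.
		fmt.Println("apiserver not reachable yet:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver TCP endpoint is up")
}
```

The certificate-rotation waits logged earlier are the same kind of simple computation, effectively time.Until(deadline): from 2025-09-30 17:01:39 UTC to the rotation deadline 2026-01-05 16:42:53 UTC is 97 days minus about 19 minutes, i.e. ≈2327h41m, matching the logged "Waiting 2327h41m14.103979238s for next certificate rotation".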
Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823096 4772 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823392 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823425 4772 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823496 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:37:26.143831798 +0000 UTC Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823571 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1648h35m46.320266699s for next certificate rotation Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823606 4772 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.823624 4772 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.823881 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.824294 4772 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.824512 4772 server.go:460] "Adding debug handlers to kubelet server" Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.824699 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="200ms" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.824940 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825263 4772 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825284 4772 factory.go:55] Registering systemd factory Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.825275 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825296 4772 factory.go:221] Registration of the systemd container factory successfully Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825761 4772 factory.go:153] Registering CRI-O factory Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825781 4772 factory.go:221] Registration of the crio container factory successfully Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825805 4772 factory.go:103] 
Registering Raw factory Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.825824 4772 manager.go:1196] Started watching for new ooms in manager Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.826586 4772 manager.go:319] Starting recovery of all containers Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.843501 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.115:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a1e199e4c8e20 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 17:01:39.820432928 +0000 UTC m=+0.727445799,LastTimestamp:2025-09-30 17:01:39.820432928 +0000 UTC m=+0.727445799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849631 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849697 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849733 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849753 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849783 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849801 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849816 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849840 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849860 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849884 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849902 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849918 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849941 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849967 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849981 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.849996 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850018 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850036 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850080 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850099 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850117 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850151 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850168 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850191 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850237 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850251 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850274 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850295 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850312 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850331 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850346 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850360 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850379 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850395 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850413 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850431 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850445 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850465 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850482 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850500 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850515 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850531 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850551 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850564 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850577 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850594 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850606 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850681 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850700 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850714 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850787 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.850804 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852343 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852379 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852392 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852409 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852426 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852443 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852456 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852471 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852483 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852499 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.852538 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854142 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854172 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854220 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854233 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854248 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854262 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854274 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854286 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854351 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854366 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854380 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854393 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854410 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854423 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854437 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854451 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854465 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854480 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854496 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854512 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854525 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854538 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854550 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854561 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854573 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854584 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854603 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854617 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854628 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854639 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854650 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854663 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854675 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854686 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854698 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854708 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854719 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854732 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854744 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854756 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854768 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854801 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854816 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854831 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854844 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854859 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854873 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854888 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854906 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854921 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854936 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854951 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854965 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854980 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.854992 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.855004 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.855019 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.855032 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857117 4772 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857165 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857180 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857195 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857208 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857221 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857235 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857248 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857260 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857273 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857285 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857296 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857356 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857371 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857382 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857393 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857406 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857420 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857432 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857447 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857459 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857470 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857481 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857491 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857503 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857515 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857528 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857539 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857550 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857561 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857573 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857583 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857595 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857610 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857620 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857632 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857644 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857654 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857667 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857678 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857691 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857703 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857715 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857726 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857738 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857748 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857760 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857771 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857782 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857791 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857802 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857814 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857826 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857837 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857848 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857859 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857871 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857882 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857894 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857905 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857915 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857928 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857939 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857950 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857960 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857970 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857981 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.857992 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858004 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858013 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858024 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858034 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858045 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858089 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858103 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858116 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858142 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858189 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858201 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858213 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858226 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858238 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858251 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858263 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858275 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858286 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858303 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858313 4772 reconstruct.go:97] "Volume reconstruction finished" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.858321 4772 reconciler.go:26] "Reconciler: start to sync state" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.874045 4772 manager.go:324] Recovery completed Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.886523 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.888865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.888919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.888945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.890487 4772 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.890511 4772 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.890532 4772 state_mem.go:36] "Initialized new in-memory state store" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.894596 4772 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.896776 4772 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.896837 4772 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.896894 4772 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.896951 4772 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 17:01:39 crc kubenswrapper[4772]: W0930 17:01:39.899510 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.899598 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.924486 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.924845 4772 policy_none.go:49] "None policy: Start" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.926118 4772 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.926155 4772 state_mem.go:35] "Initializing new in-memory state store" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.979697 4772 manager.go:334] "Starting Device Plugin manager" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.980034 4772 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.980083 4772 server.go:79] "Starting device plugin registration server" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.980647 4772 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.980670 4772 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.980873 4772 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.981032 4772 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.981046 4772 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 17:01:39 crc kubenswrapper[4772]: E0930 17:01:39.996159 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.997323 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 17:01:39 crc kubenswrapper[4772]: 
I0930 17:01:39.997424 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.998801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.998845 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.998858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.999034 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.999502 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:39 crc kubenswrapper[4772]: I0930 17:01:39.999603 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.000329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.000374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.000389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.000824 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001012 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001075 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001164 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.001985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002189 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002352 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.002408 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003554 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003712 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003916 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.003978 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.004753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.004801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.004816 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005049 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005095 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005898 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.005951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.026326 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="400ms" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.060573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.060642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.060675 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.060900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: 
I0930 17:01:40.061036 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061316 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061503 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061580 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061628 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061721 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.061830 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.081389 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.082678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.082717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.082728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.082755 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.083440 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.163901 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164208 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164216 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164332 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164366 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164446 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164407 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164596 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164686 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164824 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164932 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165136 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165149 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165044 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.164871 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165188 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165743 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165790 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.165881 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.284097 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.286247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.286402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.286486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.286864 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.287908 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.341846 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.352209 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.378901 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.396015 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.399276 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.416758 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cc19825351ac6c0dbce77bdea6af78ad68c0ec217eeea07a557fb93e05147505 WatchSource:0}: Error finding container cc19825351ac6c0dbce77bdea6af78ad68c0ec217eeea07a557fb93e05147505: Status 404 returned error can't find the container with id cc19825351ac6c0dbce77bdea6af78ad68c0ec217eeea07a557fb93e05147505 Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.419463 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-05446ce331e90ec62b1e99e21cb11d1c01835fe2e312c02af175091428e1417a WatchSource:0}: Error finding container 05446ce331e90ec62b1e99e21cb11d1c01835fe2e312c02af175091428e1417a: Status 404 returned error can't find the container with id 05446ce331e90ec62b1e99e21cb11d1c01835fe2e312c02af175091428e1417a Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.426876 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="800ms" Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.427173 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1dbee9ccd2e8744463706d8366ffe1500ba197de27d350c24df31fedd174cb0e WatchSource:0}: Error finding container 1dbee9ccd2e8744463706d8366ffe1500ba197de27d350c24df31fedd174cb0e: Status 404 returned error can't find the container with id 1dbee9ccd2e8744463706d8366ffe1500ba197de27d350c24df31fedd174cb0e Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.434316 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-49de239ad5a39f8fe97bd982f496c8559d5f124b3190368f5a855ff6a84e913e WatchSource:0}: Error finding container 49de239ad5a39f8fe97bd982f496c8559d5f124b3190368f5a855ff6a84e913e: Status 404 returned error can't find the container with id 49de239ad5a39f8fe97bd982f496c8559d5f124b3190368f5a855ff6a84e913e Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.436113 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-45c7447c279132768c35ed1d542aac7c77c9104c6df11787a799543e7303055d WatchSource:0}: Error finding container 45c7447c279132768c35ed1d542aac7c77c9104c6df11787a799543e7303055d: Status 404 returned error can't find the container with id 45c7447c279132768c35ed1d542aac7c77c9104c6df11787a799543e7303055d Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.688963 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.691457 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.691557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.691577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.691623 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.692515 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.735825 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.735934 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:40 crc kubenswrapper[4772]: W0930 17:01:40.746537 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:40 crc kubenswrapper[4772]: E0930 17:01:40.746693 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.824025 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.904924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"45c7447c279132768c35ed1d542aac7c77c9104c6df11787a799543e7303055d"} Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.907276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49de239ad5a39f8fe97bd982f496c8559d5f124b3190368f5a855ff6a84e913e"} Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.911231 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1dbee9ccd2e8744463706d8366ffe1500ba197de27d350c24df31fedd174cb0e"} Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.913832 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05446ce331e90ec62b1e99e21cb11d1c01835fe2e312c02af175091428e1417a"} Sep 30 17:01:40 crc kubenswrapper[4772]: I0930 17:01:40.915544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cc19825351ac6c0dbce77bdea6af78ad68c0ec217eeea07a557fb93e05147505"} Sep 30 17:01:41 crc kubenswrapper[4772]: W0930 17:01:41.185403 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:41 crc kubenswrapper[4772]: E0930 17:01:41.186176 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:41 crc kubenswrapper[4772]: E0930 17:01:41.228544 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="1.6s" Sep 30 17:01:41 crc kubenswrapper[4772]: W0930 17:01:41.341649 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:41 crc kubenswrapper[4772]: E0930 17:01:41.341756 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.493865 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.501982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.502086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.502103 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.502141 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:41 crc kubenswrapper[4772]: E0930 17:01:41.503945 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.823849 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.921640 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2" exitCode=0 Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.921783 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2"} Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.921828 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.923413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.923504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.923524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.924126 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="512af5c4c6105901abfeba383a59866e6c7593daed3f3f52aec5004b2456bec0" exitCode=0 Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.924158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"512af5c4c6105901abfeba383a59866e6c7593daed3f3f52aec5004b2456bec0"} Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.924284 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.925897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.925954 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.925973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.926100 4772 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f" exitCode=0 Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.926232 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.926227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f"} Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.927321 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.927700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.927746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.927766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928456 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928530 4772 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01" exitCode=0 Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928639 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01"} Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.928689 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.930246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.930290 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.930312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.932374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45"} Sep 30 17:01:41 crc kubenswrapper[4772]: I0930 17:01:41.932433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.824035 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: 
connection refused Sep 30 17:01:42 crc kubenswrapper[4772]: E0930 17:01:42.829860 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="3.2s" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.938351 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.938403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.938416 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.938428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.940176 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6fae35f5135da2c3880574f69aa50b5aaef39e81fcb6382484107ebac502bd85" exitCode=0 Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.940224 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6fae35f5135da2c3880574f69aa50b5aaef39e81fcb6382484107ebac502bd85"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.940356 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.941323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.941355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.941363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.941688 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5df798ee45454483b34381b323fbd737cd341c65028ecb28daa91980909a9c8c"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.941764 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.943600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 
17:01:42.943655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.943668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.945382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.945425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.945439 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.945439 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.946567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.946601 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.946614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.948811 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.948849 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66"} Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.948909 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.949684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.949740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:42 crc kubenswrapper[4772]: I0930 17:01:42.949761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: W0930 17:01:43.020779 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.020865 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:43 crc kubenswrapper[4772]: W0930 17:01:43.038662 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.038737 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.104148 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.105381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.105439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.105452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.105488 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.106115 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.174369 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.115:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a1e199e4c8e20 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 17:01:39.820432928 +0000 UTC m=+0.727445799,LastTimestamp:2025-09-30 17:01:39.820432928 +0000 UTC m=+0.727445799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 17:01:43 crc kubenswrapper[4772]: W0930 17:01:43.221877 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: 
connect: connection refused Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.221980 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:43 crc kubenswrapper[4772]: W0930 17:01:43.395342 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Sep 30 17:01:43 crc kubenswrapper[4772]: E0930 17:01:43.395435 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.956157 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f"} Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.956192 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.957890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.957937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.957953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.958845 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ddc6a838bee7023803138b54060d19d2ac6b06752fb3f80501a9dc148dc8d0b" exitCode=0 Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.958949 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.958998 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.959010 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.959072 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.959530 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ddc6a838bee7023803138b54060d19d2ac6b06752fb3f80501a9dc148dc8d0b"} Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.960189 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 
17:01:43.961351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961518 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.961700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.962015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.962048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:43 crc kubenswrapper[4772]: I0930 17:01:43.962081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.036834 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.967981 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dce704c8a4bf1c568cb354fa611eb0e3061361aa124d56cf57d4367322612da4"} Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968075 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2e3db035513d7b1f7f9b21b788a8c0658bdacf874966ba2e5ffecac0b10b730"} Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"46ca219377de17b5477766765c63287f6c61fd55689a1815d44e55532e30d376"} Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968117 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"918d00ff889d33b937f6e64d28db9f6596910af744b5b3afa77a870ef171bd6a"} Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968078 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968182 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.968305 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969615 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969957 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:44 crc kubenswrapper[4772]: I0930 17:01:44.969980 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.929132 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.978174 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c39c61e2b22b3ac7b96edd6d1bc59a236868b4cddd5ae35bc3797a7e6fcc54be"} Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.978210 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.978295 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.979960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.979986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.980010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.980024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.980043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:45 crc kubenswrapper[4772]: I0930 17:01:45.980028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.307008 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.309188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.309240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.309259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.309297 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.484158 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.484364 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.485844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.485887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.485899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.491891 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.592381 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.981956 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.982100 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.982172 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985204 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985394 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:46 crc kubenswrapper[4772]: 
I0930 17:01:46.985438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:46 crc kubenswrapper[4772]: I0930 17:01:46.985487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.563470 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.985150 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.985226 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.985233 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.986939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.987007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.987029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.987124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.987188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:47 crc kubenswrapper[4772]: I0930 17:01:47.987208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:48 crc kubenswrapper[4772]: I0930 17:01:48.670426 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:48 crc kubenswrapper[4772]: I0930 17:01:48.670713 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:48 crc kubenswrapper[4772]: I0930 17:01:48.672296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:48 crc kubenswrapper[4772]: I0930 17:01:48.672347 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:48 crc kubenswrapper[4772]: I0930 17:01:48.672357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.593488 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.593636 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.981228 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.981450 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.983281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.983326 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:49 crc kubenswrapper[4772]: I0930 17:01:49.983338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:49 crc kubenswrapper[4772]: E0930 17:01:49.997000 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:01:50 crc kubenswrapper[4772]: I0930 17:01:50.557151 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:50 crc kubenswrapper[4772]: I0930 17:01:50.557354 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:50 crc kubenswrapper[4772]: I0930 17:01:50.558893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:50 crc kubenswrapper[4772]: I0930 17:01:50.559175 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:50 crc kubenswrapper[4772]: I0930 17:01:50.559494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:53 crc kubenswrapper[4772]: I0930 17:01:53.825669 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.248342 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.248885 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.250380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.250432 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.250445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.309821 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.395723 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: 
Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.395808 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.404722 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 17:01:54 crc kubenswrapper[4772]: I0930 17:01:54.404818 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.004460 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.007170 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f" exitCode=255 Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.007555 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f"} Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.007963 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.008085 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010152 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.010228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:55 
crc kubenswrapper[4772]: I0930 17:01:55.011426 4772 scope.go:117] "RemoveContainer" containerID="3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f" Sep 30 17:01:55 crc kubenswrapper[4772]: I0930 17:01:55.028716 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.012550 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.014884 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.015201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2"} Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.015348 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.018497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.018579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.018595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.020587 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.020627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:56 crc kubenswrapper[4772]: I0930 17:01:56.020637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.677729 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.678274 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.678387 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.683463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.683546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.683571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:58 crc kubenswrapper[4772]: I0930 17:01:58.685112 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.023838 4772 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.025109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.025390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.025588 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.400486 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.402235 4772 trace.go:236] Trace[1478823671]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:01:48.455) (total time: 10946ms): Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1478823671]: ---"Objects listed" error: 10946ms (17:01:59.402) Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1478823671]: [10.94618945s] [10.94618945s] END Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.402472 4772 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.406333 4772 trace.go:236] Trace[1035014249]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:01:47.859) (total time: 11546ms): Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1035014249]: ---"Objects listed" error: 11546ms (17:01:59.406) Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1035014249]: [11.546961211s] [11.546961211s] END Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.406382 4772 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.406463 4772 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.407780 4772 trace.go:236] Trace[1618640791]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:01:48.214) (total time: 11193ms): Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1618640791]: ---"Objects listed" error: 11193ms (17:01:59.407) Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1618640791]: [11.193265492s] [11.193265492s] END Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.407977 4772 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.410458 4772 trace.go:236] Trace[1868507316]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:01:48.627) (total time: 10782ms): Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1868507316]: ---"Objects listed" error: 10782ms (17:01:59.410) Sep 30 17:01:59 crc kubenswrapper[4772]: Trace[1868507316]: [10.78246366s] [10.78246366s] END Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.410505 4772 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.410906 4772 kubelet_node_status.go:99] "Unable to register node with 
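
The lease error at 17:01:59.400 means the kubelet could not renew its node heartbeat within the 10s client timeout, and the Reflector ListAndWatch traces right after it (10.9s, 11.5s, 11.2s, 10.8s to list RuntimeClass, CSIDriver, Service and Node) point at the same still-slow apiserver rather than at the kubelet. Whether the heartbeat lease eventually lands can be checked from outside; a minimal sketch with client-go (assuming a reachable kubeconfig in $KUBECONFIG and the context-taking client signatures of client-go v0.20+):

    // lease_check.go - sketch: read the node heartbeat lease the kubelet
    // was failing to renew ("kube-node-lease/crc" in the entry above).
    package main

    import (
        "context"
        "fmt"
        "os"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        lease, err := cs.CoordinationV1().Leases("kube-node-lease").Get(
            context.Background(), "crc", metav1.GetOptions{})
        if err != nil {
            fmt.Println("lease fetch failed:", err) // report, don't panic, while the apiserver is slow
            return
        }
        if lease.Spec.HolderIdentity != nil {
            fmt.Println("holder:", *lease.Spec.HolderIdentity)
        }
        if lease.Spec.RenewTime != nil { // nil on a lease that has never been renewed
            fmt.Println("last renew:", lease.Spec.RenewTime.Time)
        }
    }
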
API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.593770 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.593845 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.803544 4772 apiserver.go:52] "Watching apiserver" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.808221 4772 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.808516 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.808945 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.809006 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.809089 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.809050 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.809420 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.809551 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.809906 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.809925 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.810009 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812200 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812257 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812312 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812364 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812521 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.812530 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.816618 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.816717 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.817296 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.825678 4772 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.844372 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.878907 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.903641 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.908992 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909073 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909107 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909135 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909159 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909184 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909244 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:01:59 crc 
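
The three status_manager failures above all die on the same call: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 refuses connections because it is served by the network-node-identity-vrzqb pod, whose webhook and approver containers are themselves still in ContainerCreating, a circular dependency that resolves itself once the pod's sandbox comes up. The patch bodies are hard to read because they are quoted twice, once as a JSON string and once inside the klog err="..." field; a small sketch to unfold such a fragment (the excerpt below is shortened and hypothetical, with one level of quoting already stripped):

    // patch_pretty.go - sketch: unquote and indent one of the escaped
    // status patches logged by status_manager.go above.
    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "strconv"
    )

    func main() {
        // One strconv.Unquote pass resolves the \" escapes; a fragment cut
        // straight from the journal may need a second pass for the outer field.
        raw := `"{\"metadata\":{\"uid\":\"3b6479f0-333b-4a96-9adf-2099afdc2447\"},\"status\":{\"podIP\":null,\"podIPs\":null}}"`

        patch, err := strconv.Unquote(raw)
        if err != nil {
            panic(err)
        }
        var out bytes.Buffer
        if err := json.Indent(&out, []byte(patch), "", "  "); err != nil {
            panic(err)
        }
        fmt.Println(out.String())
    }
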
kubenswrapper[4772]: I0930 17:01:59.909267 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909291 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909311 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909341 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909375 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909405 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909429 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909452 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909502 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 
30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909554 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909584 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909620 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909737 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909831 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909842 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909861 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909894 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909924 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.909973 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910077 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910118 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910145 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910170 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910195 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910223 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910251 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910296 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910346 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910351 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910370 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910515 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910547 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910572 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910594 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910598 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910616 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910639 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910660 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910683 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910740 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910767 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910796 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910793 4772 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910823 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910837 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910850 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910876 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910899 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910961 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.910987 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911022 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911087 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911110 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911134 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911160 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911185 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911187 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911235 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911285 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911305 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911327 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911347 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911445 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911467 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911491 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911513 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911566 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911585 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911605 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911628 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911649 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911671 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911692 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911742 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911766 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911790 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911813 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911862 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911882 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911905 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911948 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911970 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.911991 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912012 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912032 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912076 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912101 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912122 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912142 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912162 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912185 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912200 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912208 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912320 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912347 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912372 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912404 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912428 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912454 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912476 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912502 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912527 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912552 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912577 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.912601 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913122 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913121 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913181 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913208 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913233 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913337 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913387 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913413 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913436 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913462 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913486 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913512 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913540 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913570 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913596 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913620 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913644 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913692 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913720 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913745 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913769 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913818 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913841 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913872 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913922 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913951 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913990 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914020 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914088 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914112 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914134 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914156 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914180 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914203 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914226 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914275 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914301 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914327 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914352 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914377 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914403 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914482 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914508 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914531 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914603 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914626 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914650 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914674 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914726 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914752 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914778 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914827 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914851 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914875 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914900 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914927 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914988 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915019 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915045 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915172 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915202 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915228 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915257 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915285 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915425 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915492 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915519 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915596 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915612 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915627 4772 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915641 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915659 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915673 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915688 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915703 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915717 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915733 4772 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915746 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915760 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915778 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915792 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916300 4772 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.948264 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.956477 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.958175 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967306 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967970 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979279 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913691 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod 
"5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.913951 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.914935 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915031 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979361 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915475 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915642 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.915945 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916044 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979461 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916305 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916581 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979553 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916728 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.916775 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.917029 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.917070 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.917238 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.917559 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.918143 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.918638 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.918804 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.920254 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.920548 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.920543 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.920605 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.920905 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921313 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921466 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921501 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921536 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921704 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921850 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.921929 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922143 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922176 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922335 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922761 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.922769 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979952 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.924361 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.924989 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925141 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925283 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925481 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.925698 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926144 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926316 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.926817 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.927075 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.980305 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.927087 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.927299 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.927298 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.947467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.947505 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.947517 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.947794 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.947950 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.948554 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.980452 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.980468 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.948719 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.950232 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.950313 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.950323 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.950525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.950538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.950837 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.951205 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.951253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.951683 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.951785 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952093 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952199 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952581 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952576 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952586 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952768 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952779 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952862 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.954341 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.954839 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.955139 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.955707 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.956420 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.952228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.962122 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.962513 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.963020 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.963300 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.963436 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.963536 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:00.463507712 +0000 UTC m=+21.370520533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.980932 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:00.480912223 +0000 UTC m=+21.387925054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.980959 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:00.480953494 +0000 UTC m=+21.387966325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.980971 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:00.480967084 +0000 UTC m=+21.387979905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.963718 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.964050 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.964212 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.964495 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.964710 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.964879 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.965460 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966336 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966477 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.981278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.981291 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966725 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966769 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967079 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967389 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967498 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967606 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967741 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967907 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.969479 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.962729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.972568 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.972581 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.972621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.972668 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.973563 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.973573 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.973778 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974313 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974332 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974465 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974607 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974721 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.974983 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.975021 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.975042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.975116 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.975917 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.976297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.976624 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.978804 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.978827 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.979264 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.966808 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.967241 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.981721 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.981783 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.982280 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.982449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.982807 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.982844 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983168 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983249 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983432 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983613 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983666 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.983857 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.984082 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.984342 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.984579 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.985349 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.985353 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.988182 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.985365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.985645 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.987401 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.988394 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.988410 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:01:59 crc kubenswrapper[4772]: E0930 17:01:59.988468 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:00.488445745 +0000 UTC m=+21.395458576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.992531 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.992924 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.995843 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.996222 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.996895 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.997560 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.997790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.999455 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:01:59 crc kubenswrapper[4772]: I0930 17:01:59.999649 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.000793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.001279 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.001315 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.003373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.003379 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016557 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016670 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016680 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016693 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016703 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016712 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016721 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016729 4772 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016738 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016751 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016760 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016768 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016777 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016785 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016795 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016804 4772 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016812 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016821 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016830 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016839 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016847 4772 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016856 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016865 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016873 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016881 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016890 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016899 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016917 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016926 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016946 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016955 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016963 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016972 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016981 
4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016990 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.016999 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017008 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017020 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017029 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017038 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017046 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017071 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017080 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017089 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017100 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017109 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc 
kubenswrapper[4772]: I0930 17:02:00.017117 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017126 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017135 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017144 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017152 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017161 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017170 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017179 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017187 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017196 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017204 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017213 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017222 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 
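The long run of reconciler_common.go:293 records above is the volume reconciler confirming, volume by volume, that everything belonging to the deleted pods is gone from its actual state of the world; DevicePath is empty because these configmap, secret, projected, and empty-dir volumes are not attachable devices. A hypothetical tally over such a stream (not kubelet code) that counts detached volumes per plugin type, to summarize a teardown storm like this one:

```go
// detach_tally.go: hypothetical helper that counts "Volume detached" records
// per volume plugin (configmap, secret, projected, empty-dir, ...).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// The plugin type sits between "kubernetes.io/" and the next "/" in the
// UniqueName shown in each record.
var detachedRE = regexp.MustCompile(`Volume detached for volume .*kubernetes\.io/([a-z-]+)/`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := detachedRE.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for plugin, n := range counts {
		fmt.Printf("%-10s %d\n", plugin, n)
	}
}
```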
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017248 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017256 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017266 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017275 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017283 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017293 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017301 4772 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017310 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017318 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017328 4772 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017336 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017344 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017353 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017363 4772 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017373 4772 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017384 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017393 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017402 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017410 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017420 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017428 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017438 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017446 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017453 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017461 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017469 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017479 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017487 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017495 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017511 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017521 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017528 4772 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017537 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017545 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017555 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017563 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017571 4772 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017580 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017590 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017598 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017606 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017615 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017623 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017630 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017639 4772 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017647 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017654 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017662 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017670 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017680 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017689 4772 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017698 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017707 4772 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017715 4772 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017723 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017732 4772 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017741 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017749 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017760 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017769 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017778 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017787 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017795 4772 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017804 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017814 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017823 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017831 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017849 4772 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017858 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017866 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017874 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017883 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017891 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017900 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017908 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017917 4772 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017925 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017933 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017941 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017949 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017957 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017966 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017974 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017982 4772 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017990 4772 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.017999 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018007 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018015 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018024 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018034 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node
\"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018043 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018052 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018084 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018092 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018102 4772 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018111 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018121 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018130 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018139 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018762 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.018840 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019430 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019832 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019844 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019853 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019862 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019872 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019881 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019891 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019900 4772 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019909 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019918 4772 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019927 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019936 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019945 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019953 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019962 4772 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019971 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019980 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.019989 4772 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.020030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.021619 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.025919 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.027237 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.030821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.046259 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.063774 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.075428 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.093145 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.104638 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.108516 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.120910 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.120947 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.120958 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.122453 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.123464 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.132539 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.134294 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.137700 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.145584 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.158852 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.172922 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.189235 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.201329 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.214722 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.235025 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.249919 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crco
nt/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.278469 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.480037 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-k2jvh"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.480494 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.482950 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.482957 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.483036 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.494132 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.507318 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.516261 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.525199 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.525280 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.525308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.525331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.525351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525611 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525630 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:01.525400406 +0000 UTC m=+22.432413247 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525670 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:02:01.525660203 +0000 UTC m=+22.432673034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525775 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525815 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525831 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525878 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525943 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525966 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525901 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:01.52587711 +0000 UTC m=+22.432889991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.525901 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.526094 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:01.526035884 +0000 UTC m=+22.433048715 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:00 crc kubenswrapper[4772]: E0930 17:02:00.526128 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:01.526116876 +0000 UTC m=+22.433129907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.528800 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.545173 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.558891 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.578639 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crco
nt/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.597687 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.612858 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.626213 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-hosts-file\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.626299 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqk6\" (UniqueName: \"kubernetes.io/projected/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-kube-api-access-8fqk6\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.727978 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-hosts-file\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.728037 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqk6\" (UniqueName: \"kubernetes.io/projected/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-kube-api-access-8fqk6\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.728217 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-hosts-file\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.818803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqk6\" (UniqueName: \"kubernetes.io/projected/1f081528-51e8-4088-bb5c-f51e7ab0bc7a-kube-api-access-8fqk6\") pod \"node-resolver-k2jvh\" (UID: \"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\") " pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.864250 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7br52"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.864643 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7br52" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.866851 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.867079 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.867305 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.867409 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.867811 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.869569 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rkhll"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.869918 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-47rqk"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.870407 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.870741 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.872202 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bj99l"] Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.872892 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.873012 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.873116 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.873116 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.873557 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.873741 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.883762 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.883885 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884000 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884124 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884116 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884270 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884511 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884586 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884712 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.884789 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.899884 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.916696 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.936995 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crco
nt/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.952363 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.970236 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:00 crc kubenswrapper[4772]: I0930 17:02:00.999152 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031720 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031767 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031790 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031820 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-system-cni-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031842 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031867 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.031907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032003 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-multus-certs\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-etc-kubernetes\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032110 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-cni-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032127 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032188 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032272 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash\") pod 
\"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032391 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cnibin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032437 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cni-binary-copy\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032459 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-proxy-tls\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032480 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032498 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032540 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-bin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032558 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cnibin\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032716 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-os-release\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032895 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-netns\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032955 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-conf-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.032978 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9kl\" (UniqueName: \"kubernetes.io/projected/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-kube-api-access-hr9kl\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033001 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033026 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-daemon-config\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc 
kubenswrapper[4772]: I0930 17:02:01.033082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033116 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-system-cni-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033142 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-socket-dir-parent\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033161 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2f7\" (UniqueName: \"kubernetes.io/projected/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-kube-api-access-8b2f7\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033183 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033223 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033242 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033293 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033314 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-multus\") pod \"multus-7br52\" (UID: 
\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033332 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-kubelet\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033354 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-rootfs\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033384 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27g86\" (UniqueName: \"kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-os-release\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033429 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-hostroot\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxtr\" (UniqueName: \"kubernetes.io/projected/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-kube-api-access-wrxtr\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.033488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-k8s-cni-cncf-io\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.034847 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.052377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.052430 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.052444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f44236783eeebee776aa7a3416b9d29b6c732cf7f8f8bc8bd01520207cca5128"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.053302 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aa55da263415cb7a385aea83baa8bc25a896cbc40564f0ce8acfb39d62b85d68"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.054721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.054750 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"46827a7a38a4a6f9f02863acafb6522b998c6d27d3ce1e4ba1945282de8ffb5b"} Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.058811 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.078489 4772 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.094228 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k2jvh" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.102027 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: W0930 17:02:01.112083 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f081528_51e8_4088_bb5c_f51e7ab0bc7a.slice/crio-ebe919c36de737d1c7f9c81f3de8161ca55c46dfc7bc58d45cbc788e8ee67f2f WatchSource:0}: Error finding container 
Sep 30 17:02:01 crc kubenswrapper[4772]: W0930 17:02:01.112083 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f081528_51e8_4088_bb5c_f51e7ab0bc7a.slice/crio-ebe919c36de737d1c7f9c81f3de8161ca55c46dfc7bc58d45cbc788e8ee67f2f WatchSource:0}: Error finding container ebe919c36de737d1c7f9c81f3de8161ca55c46dfc7bc58d45cbc788e8ee67f2f: Status 404 returned error can't find the container with id ebe919c36de737d1c7f9c81f3de8161ca55c46dfc7bc58d45cbc788e8ee67f2f
Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.118508 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.135812 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-daemon-config\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-system-cni-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136386 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-socket-dir-parent\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-socket-dir-parent\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136561 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-system-cni-dir\") pod \"multus-7br52\" (UID: 
\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2f7\" (UniqueName: \"kubernetes.io/projected/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-kube-api-access-8b2f7\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136682 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136734 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136732 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-daemon-config\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136780 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-multus\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136799 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-kubelet\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-rootfs\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " 
pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136835 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27g86\" (UniqueName: \"kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136862 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-os-release\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136887 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-hostroot\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-kubelet\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136906 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxtr\" (UniqueName: \"kubernetes.io/projected/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-kube-api-access-wrxtr\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136950 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-k8s-cni-cncf-io\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-multus\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137024 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137047 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137036 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137086 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137182 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-system-cni-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-k8s-cni-cncf-io\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137237 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137273 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137293 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137311 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-system-cni-dir\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137325 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-multus-certs\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.136952 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-etc-kubernetes\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137376 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-hostroot\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-etc-kubernetes\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137344 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137627 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-cni-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-multus-certs\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137734 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137771 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-rootfs\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-cni-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.137999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138032 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138091 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-os-release\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138092 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138145 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138173 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138202 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cnibin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cni-binary-copy\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-proxy-tls\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138285 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138336 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-bin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138361 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cnibin\") pod 
\"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138385 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-os-release\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138454 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138488 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138592 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-netns\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-conf-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138685 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9kl\" (UniqueName: \"kubernetes.io/projected/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-kube-api-access-hr9kl\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138809 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138828 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138873 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cnibin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138864 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-run-netns\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.138981 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-multus-conf-dir\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-cnibin\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139032 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-os-release\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " 
pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139085 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139108 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-host-var-lib-cni-bin\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139234 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-cni-binary-copy\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.139819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.140326 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.140473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.140954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.142447 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-proxy-tls\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.159674 4772 
Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.159674 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27g86\" (UniqueName: \"kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86\") pod \"ovnkube-node-bj99l\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " pod="openshift-ovn-kubernetes/ovnkube-node-bj99l"
Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.162440 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.163753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2f7\" (UniqueName: \"kubernetes.io/projected/8e885147-8bd5-4c7a-9331-ec1f4eebd3f7-kube-api-access-8b2f7\") pod \"machine-config-daemon-rkhll\" (UID: \"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\") " pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.167017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxtr\" (UniqueName: \"kubernetes.io/projected/ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7-kube-api-access-wrxtr\") pod \"multus-additional-cni-plugins-47rqk\" (UID: \"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\") " pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.167394 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9kl\" (UniqueName: \"kubernetes.io/projected/5e5b90d4-3f5e-49d8-b2c5-175948eeeda6-kube-api-access-hr9kl\") pod \"multus-7br52\" (UID: \"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\") " pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.180786 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.183972 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7br52" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.190525 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.197148 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-47rqk" Sep 30 17:02:01 crc kubenswrapper[4772]: W0930 17:02:01.197621 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e5b90d4_3f5e_49d8_b2c5_175948eeeda6.slice/crio-52095f44c082353ba1d7275ee3955c57b54427c9ecf7fc21887778d213c7650a WatchSource:0}: Error finding container 52095f44c082353ba1d7275ee3955c57b54427c9ecf7fc21887778d213c7650a: Status 404 returned error can't find the container with id 52095f44c082353ba1d7275ee3955c57b54427c9ecf7fc21887778d213c7650a Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.203942 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.203920 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.228271 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: W0930 17:02:01.233014 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e885147_8bd5_4c7a_9331_ec1f4eebd3f7.slice/crio-beb7a445b97d180faebea2f26214514d0b1839315a1f6e9ce4665e7f3b7eb2c5 WatchSource:0}: Error finding container beb7a445b97d180faebea2f26214514d0b1839315a1f6e9ce4665e7f3b7eb2c5: Status 404 returned error can't find the container with id beb7a445b97d180faebea2f26214514d0b1839315a1f6e9ce4665e7f3b7eb2c5 Sep 30 17:02:01 crc kubenswrapper[4772]: W0930 17:02:01.233846 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec22019d_863b_4e4b_98a9_1ceaa9fbd9f7.slice/crio-d9eda59442191493dbfc74059da3b4fe6a2eaf1c6735ba034203f09a8d10e620 WatchSource:0}: Error finding container d9eda59442191493dbfc74059da3b4fe6a2eaf1c6735ba034203f09a8d10e620: Status 404 returned error can't find the container with id d9eda59442191493dbfc74059da3b4fe6a2eaf1c6735ba034203f09a8d10e620 Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.258330 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.289512 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.314603 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: 
I0930 17:02:01.330127 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.345722 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.358131 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.542543 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.542625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.542660 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.542684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.542712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.542829 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.542894 
4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:03.542873806 +0000 UTC m=+24.449886637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543201 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543301 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:03.543279677 +0000 UTC m=+24.450292508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543326 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:03.543316208 +0000 UTC m=+24.450329039 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543658 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543695 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543708 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543760 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:03.54374402 +0000 UTC m=+24.450756851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543660 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543785 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543792 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.543816 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:03.543810372 +0000 UTC m=+24.450823203 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.897698 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.897706 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.898313 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.897865 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.898433 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:01 crc kubenswrapper[4772]: E0930 17:02:01.898548 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.901707 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.902487 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.903586 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.904240 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.905200 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.905748 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.906745 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.907291 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.908362 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.908965 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.909594 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.911661 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.912877 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.914218 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.914784 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.916043 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.916712 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.917140 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.918157 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.918775 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.919462 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.920821 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.922312 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.923380 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.924153 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.925260 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.925896 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.926882 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.927579 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.928166 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.929072 4772 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.929176 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.930811 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.931705 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.932141 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.933575 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.934601 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.935145 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.936105 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.936739 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.937226 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.938163 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.939161 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.939800 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.940718 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.941240 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.942118 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.942843 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.943667 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.944562 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.945006 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.945898 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.946455 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 17:02:01 crc kubenswrapper[4772]: I0930 17:02:01.947269 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.060033 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerStarted","Data":"6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.060110 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" 
event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerStarted","Data":"52095f44c082353ba1d7275ee3955c57b54427c9ecf7fc21887778d213c7650a"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.061973 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k2jvh" event={"ID":"1f081528-51e8-4088-bb5c-f51e7ab0bc7a","Type":"ContainerStarted","Data":"6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.062000 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k2jvh" event={"ID":"1f081528-51e8-4088-bb5c-f51e7ab0bc7a","Type":"ContainerStarted","Data":"ebe919c36de737d1c7f9c81f3de8161ca55c46dfc7bc58d45cbc788e8ee67f2f"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.063380 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c" exitCode=0 Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.063424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.063475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"47465566be7e0751b90cd0d519586f43f7758fc8174172bd466eea0feebee39b"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.064982 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerStarted","Data":"8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.065008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerStarted","Data":"d9eda59442191493dbfc74059da3b4fe6a2eaf1c6735ba034203f09a8d10e620"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.068320 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.068372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.068383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"beb7a445b97d180faebea2f26214514d0b1839315a1f6e9ce4665e7f3b7eb2c5"} Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.081962 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.094702 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.112027 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.132645 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.152634 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.167015 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.179800 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.194837 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.208559 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.223088 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.243784 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: 
I0930 17:02:02.258001 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.272866 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.289415 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.305529 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.323880 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.337676 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.352313 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.365827 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.411862 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.471684 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.487516 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.502747 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.517039 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.532904 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:02 crc kubenswrapper[4772]: I0930 17:02:02.557613 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:02Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076528 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076573 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076584 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.076598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.078501 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4" exitCode=0 Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.078585 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.081541 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4"} Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.096983 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.111904 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.131364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.147231 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.164727 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.181478 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.197049 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.212728 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.226962 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.240106 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.263823 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.276699 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.290835 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.305608 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 
2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.319459 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.335419 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.348598 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.363915 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.378861 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.391132 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.410471 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.425417 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.442976 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.460458 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.476833 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.492784 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.553802 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-j5z7n"] Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.554266 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.556220 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.556227 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.557191 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.557711 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.560533 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.560641 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.560690 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.560750 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:07.560697859 +0000 UTC m=+28.467710690 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.560793 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.560844 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.560951 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.560957 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.560994 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561008 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561032 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:07.561008928 +0000 UTC m=+28.468021779 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561031 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561038 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561091 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:07.561045199 +0000 UTC m=+28.468058040 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561102 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561125 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561166 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:07.561140382 +0000 UTC m=+28.468153233 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.561196 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:07.561185353 +0000 UTC m=+28.468198194 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.570782 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.582638 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.595766 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.605129 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.622028 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.643343 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.655527 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.662281 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82faca8b-622c-4731-a320-ff2bc04d040b-host\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.662366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82faca8b-622c-4731-a320-ff2bc04d040b-serviceca\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.662398 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78s6\" (UniqueName: \"kubernetes.io/projected/82faca8b-622c-4731-a320-ff2bc04d040b-kube-api-access-m78s6\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.666974 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98
248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.694081 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.705239 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.720501 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.761514 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.764282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82faca8b-622c-4731-a320-ff2bc04d040b-host\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.764346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82faca8b-622c-4731-a320-ff2bc04d040b-serviceca\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.764376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-m78s6\" (UniqueName: \"kubernetes.io/projected/82faca8b-622c-4731-a320-ff2bc04d040b-kube-api-access-m78s6\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.764516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82faca8b-622c-4731-a320-ff2bc04d040b-host\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.766052 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82faca8b-622c-4731-a320-ff2bc04d040b-serviceca\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.798838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78s6\" (UniqueName: \"kubernetes.io/projected/82faca8b-622c-4731-a320-ff2bc04d040b-kube-api-access-m78s6\") pod \"node-ca-j5z7n\" (UID: \"82faca8b-622c-4731-a320-ff2bc04d040b\") " pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.818306 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.834400 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.860629 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j5z7n" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.898213 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.898269 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:03 crc kubenswrapper[4772]: I0930 17:02:03.898279 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.898407 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.898525 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:03 crc kubenswrapper[4772]: E0930 17:02:03.898653 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.088886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5z7n" event={"ID":"82faca8b-622c-4731-a320-ff2bc04d040b","Type":"ContainerStarted","Data":"a2e2c4677bb90125d37a97771de5b62dcddfc4302aad2cd954b361057070df10"} Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.091162 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc" exitCode=0 Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.091241 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc"} Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.108029 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.122363 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.138081 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.152597 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.167666 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.183446 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.198836 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.215700 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.245384 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.260210 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.276920 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-m
ultus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.296829 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.314682 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506
ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:04 crc kubenswrapper[4772]: I0930 17:02:04.329418 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.098627 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441" exitCode=0 Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.098736 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441"} Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.104762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5z7n" event={"ID":"82faca8b-622c-4731-a320-ff2bc04d040b","Type":"ContainerStarted","Data":"0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c"} Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.118344 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.136877 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.159999 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.181301 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 
crc kubenswrapper[4772]: I0930 17:02:05.197677 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.217014 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.228961 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.241963 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.261045 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.271636 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.283783 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.296773 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.310164 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.325908 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.342010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.358492 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.374280 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.390267 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.409848 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.422579 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.438120 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.454472 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.467025 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.481015 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.494303 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.506189 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.518961 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.539563 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.811774 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.814966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.815027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.815046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.815245 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.835289 4772 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.835728 4772 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.838015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.838097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.838120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.838157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.838177 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.874401 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.880236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.880297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.880313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.880333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.880347 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.897340 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.897526 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.897692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.897832 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.898046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.898204 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.905283 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.913920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.914012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.914038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.914233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.914296 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.935463 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.941956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.942011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.942027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.942048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.942081 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.959993 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.966942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.966989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.967002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.967026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.967046 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.987339 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:05 crc kubenswrapper[4772]: E0930 17:02:05.987520 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.989811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.989877 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.989889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.989912 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:05 crc kubenswrapper[4772]: I0930 17:02:05.989927 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:05Z","lastTransitionTime":"2025-09-30T17:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.092838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.092888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.092900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.092919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.092932 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.113909 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.118760 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe" exitCode=0 Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.118863 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.138756 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.163641 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.185349 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.195983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.196029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.196041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.196088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.196102 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.204745 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.223125 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.238600 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.263242 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.276982 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.299869 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.299927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.299942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.299962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.299975 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.301143 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.315507 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.332398 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.349480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.364932 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.382454 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.405438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.405533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.405547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.405578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.405591 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.509614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.510092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.510231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.510364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.510487 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.598932 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.603644 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.613494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.613544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.613562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.613586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.613604 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.617861 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.638541 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.655994 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.680733 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.693262 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.711863 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.716174 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.716417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.716576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.716733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.716885 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.730137 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.746022 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.763846 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.790153 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.807924 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.819620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.819666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.819678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.819697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.819711 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.828882 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.850716 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.867818 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.907978 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.922599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.922652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.922667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.922690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.922709 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:06Z","lastTransitionTime":"2025-09-30T17:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.932277 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.958718 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.970514 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:06 crc kubenswrapper[4772]: I0930 17:02:06.985180 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.000641 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.016555 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.026045 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.026093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.026103 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.026122 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.026137 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.030694 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.060923 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.078456 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.098136 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
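Both timestamps needed to size the problem appear verbatim in every one of these errors: the node's current time and the certificate's notAfter. A small sketch that pulls them out of one line above and reports the skew; the message text is copied from the log:

import re
from datetime import datetime, timezone

msg = ("tls: failed to verify certificate: x509: certificate has expired "
       "or is not yet valid: current time 2025-09-30T17:02:07Z is after "
       "2025-08-24T17:21:41Z")
m = re.search(r"current time (\S+) is after (\S+)", msg)
fmt = "%Y-%m-%dT%H:%M:%SZ"
now = datetime.strptime(m.group(1), fmt).replace(tzinfo=timezone.utc)
not_after = datetime.strptime(m.group(2), fmt).replace(tzinfo=timezone.utc)
# For the values above this prints: 36 days, 23:40:26
print(f"certificate expired {now - not_after} before this entry")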
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.113886 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.128447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.128498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.128507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.128524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.128538 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.129146 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0" exitCode=0 Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.129276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.129360 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.156948 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.177051 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.197794 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.216315 
4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.232725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.232797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.232814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.233194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.233265 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
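Independently of the webhook failure, the Ready condition keeps flipping to False because the kubelet finds no CNI configuration. A trivial check of the directory named in the message, with the path copied from the log; the *.conf* glob covering .conf and .conflist files is an assumption about what counts as a config here:

from pathlib import Path

cni_dir = Path("/etc/kubernetes/cni/net.d")
found = sorted(p.name for p in cni_dir.glob("*.conf*")) if cni_dir.is_dir() else []
print(found or "no CNI configuration files, matching the NotReady condition")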
Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.238039 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrx
tr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.257886 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.281848 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.316993 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.336096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.336160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.336175 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.336201 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.336215 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.357376 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.397736 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.432636 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.438951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.439006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.439016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.439033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.439046 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.476328 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.515761 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.541711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.541775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.541785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.541801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.541814 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.555555 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.597102 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.607364 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.607448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.607482 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.607502 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.607526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607646 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607657 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607707 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:02:15.60769134 +0000 UTC m=+36.514704171 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607751 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:15.607724751 +0000 UTC m=+36.514737592 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607786 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607831 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607850 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607864 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607938 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607942 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:15.607914156 +0000 UTC m=+36.514927187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.607957 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.608043 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:15.608016599 +0000 UTC m=+36.515029430 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.608082 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:15.60807165 +0000 UTC m=+36.515084731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.646286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.646358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.646374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.646403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.646419 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.749511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.749577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.749589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.749608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.749618 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.852779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.852815 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.852824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.852848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.852860 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.897565 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.897620 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.897817 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.897913 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.898142 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:07 crc kubenswrapper[4772]: E0930 17:02:07.898245 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.955357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.955612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.955775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.956023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:07 crc kubenswrapper[4772]: I0930 17:02:07.956238 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:07Z","lastTransitionTime":"2025-09-30T17:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.059824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.059861 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.059874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.059893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.059904 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.142240 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7" containerID="10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded" exitCode=0 Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.142346 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerDied","Data":"10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.153778 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.154462 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.154526 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.154673 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.157373 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"ho
stIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.161818 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.161856 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.161867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.161886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.161898 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.177249 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.188497 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.188658 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.200142 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.217704 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.235250 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
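Every "Failed to update status for pod" record in this stretch fails for the same root cause: the serving certificate behind the pod.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-09-30T17:02:08Z. The validity-window comparison that produces "certificate has expired or is not yet valid" can be reproduced with the standard library; the certificate path below is hypothetical and stands in for the webhook's actual tls.crt.

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; substitute the webhook's real serving certificate.
	data, err := os.ReadFile("/path/to/webhook-cert/tls.crt")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}

	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		// Matches the log: "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid until %s\n", cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```

Until that certificate is rotated, the kubelet can run containers but cannot persist any pod status patch, which is why the same webhook error repeats for every pod below.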
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.253110 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.265545 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.265604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.265622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.265644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.265661 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
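The patch bodies in these status_manager.go records are hard to read because klog %q-quotes the whole error string and the JSON patch inside it was already quoted once, so every double quote arrives as \\\". A small sketch (Go 1.17+ for strconv.QuotedPrefix) that undoes both layers and pretty-prints the patch; the embedded value is a shortened stand-in for one err="..." token copied from the log, surrounding quotes included:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
	"strings"
)

func main() {
	// Shortened stand-in for an err="..." value from the log (the real
	// payloads run to several kilobytes).
	quoted := `"failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\""`

	// Layer 1: undo klog's %q quoting of the whole error string.
	msg, err := strconv.Unquote(quoted)
	if err != nil {
		panic(err)
	}

	// Layer 2: the patch itself was quoted again inside the message;
	// locate the quoted JSON token and unquote it as well.
	i := strings.Index(msg, `"`)
	tok, err := strconv.QuotedPrefix(msg[i:])
	if err != nil {
		panic(err)
	}
	patch, err := strconv.Unquote(tok)
	if err != nil {
		panic(err)
	}

	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(patch), "", "  "); err != nil {
		panic(err)
	}
	fmt.Println(pretty.String())
}
```

Decoded this way, the patches are ordinary strategic-merge bodies: the $setElementOrder/conditions directive fixes the ordering of the pod's condition entries, and the conditions/containerStatuses arrays carry the fields being updated.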
Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.269211 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.286821 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9
fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.300728 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.314097 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.331152 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
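The ContainerStatusUnknown entries above and below carry exitCode 137. By the usual Unix shell and runtime convention, exit codes above 128 encode a fatal signal as 128+signo, so 137 corresponds to SIGKILL (9): the previous container instance was killed rather than exiting on its own. A trivial sketch of that arithmetic:

```go
package main

import (
	"fmt"
	"syscall"
)

// signalFromExitCode decodes the Unix 128+signo convention used for
// signal-terminated processes; ok is false for ordinary exit codes.
func signalFromExitCode(code int) (sig syscall.Signal, ok bool) {
	if code <= 128 || code > 128+64 {
		return 0, false
	}
	return syscall.Signal(code - 128), true
}

func main() {
	if sig, ok := signalFromExitCode(137); ok {
		fmt.Printf("exit code 137 => signal %d (%s)\n", int(sig), sig) // 9 (killed)
	}
}
```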
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.344515 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.366611 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.368992 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.369046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.369089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.369113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.369132 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.393532 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.413799 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.437795 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.450639 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.464848 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.471782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.471847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.471864 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.471891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.471904 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.486993 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de
571f1d5d6f8ca51804dcc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.499443 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.515092 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.535476 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.555387 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.571909 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.573929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.573989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.574009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.574034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.574079 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
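[Editor's note] Every status-patch failure in this stretch of the log shares the same root cause, spelled out at the end of each entry: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, more than a month before the node's clock time of 2025-09-30T17:02:08Z. A minimal Go sketch of how that validity window could be confirmed from the node itself; this is illustrative only and not part of the log, and it assumes the webhook is still listening on that port. InsecureSkipVerify is deliberate here, since a verifying handshake against an expired certificate aborts before the peer certificate can be inspected.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the failing Post in the log entries above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	// After a successful handshake the leaf certificate is first in the chain.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
}

Until that certificate is rotated, every kubelet status patch intercepted by this webhook will keep failing with the same x509 error, which is why the entries below repeat with only the pod name and timestamp changing.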
Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.594030 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.637225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.677304 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.677369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.677386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.677410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.677426 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.687746 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a311
60a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.717526 4772 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.780396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.780467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.780492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.780520 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.780541 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.887584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.887685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.887710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.887867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.887907 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.992659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.992726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.992744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.992771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:08 crc kubenswrapper[4772]: I0930 17:02:08.992790 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:08Z","lastTransitionTime":"2025-09-30T17:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
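[Editor's note] The NodeNotReady transitions interleaved through this section are the second symptom: the kubelet's runtime status check finds no CNI configuration under /etc/kubernetes/cni/net.d/ and reports NetworkReady=false until the network plugin (multus here, whose pods are visible above) writes one. A rough Go sketch of that kind of directory probe; again illustrative rather than taken from the kubelet source, and it assumes the conventional CNI config extensions .conf, .conflist, and .json are what the loader accepts.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet message above.
	dir := "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}

	found := false
	for _, e := range entries {
		// Extensions a CNI config loader conventionally accepts.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config present:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file in", dir, "- network stays NotReady")
	}
}

On a healthy node this directory is typically populated shortly after the multus and network-operator pods come up, at which point NetworkReady flips to true and these repeated NodeNotReady events stop; here the network operator is itself blocked by the same expired webhook certificate.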
Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.096397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.096459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.096474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.096498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.096514 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.162141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" event={"ID":"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7","Type":"ContainerStarted","Data":"98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.179308 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.197177 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\
"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.199527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.199591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.199605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.199629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.199645 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.216196 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.236324 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.256392 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.274163 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.294103 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.302109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.302166 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.302181 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.302221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.302237 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.313730 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.337477 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.354817 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.386042 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.405100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.405153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.405171 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc 
kubenswrapper[4772]: I0930 17:02:09.405208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.405233 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.414464 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.434499 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.451982 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.508576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.508626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.508639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.508662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.508678 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.611660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.611753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.611769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.611792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.611808 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.714582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.714647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.714660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.714681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.714702 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.818007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.818099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.818119 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.818146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.818163 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.897857 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.897908 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.898201 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:09 crc kubenswrapper[4772]: E0930 17:02:09.898306 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:09 crc kubenswrapper[4772]: E0930 17:02:09.898140 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:09 crc kubenswrapper[4772]: E0930 17:02:09.898457 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.917665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.921813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.921887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.921911 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.921943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.921969 4772 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:09Z","lastTransitionTime":"2025-09-30T17:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.943480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:09 crc kubenswrapper[4772]: I0930 17:02:09.976779 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.002010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.018392 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.025013 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.025080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.025093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.025114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.025131 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.038851 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.057104 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.081798 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.100229 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.115151 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.127881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.127950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.127976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.128008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.128034 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.136302 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.151506 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.169275 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.198699 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.231373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.231427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.231448 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc 
kubenswrapper[4772]: I0930 17:02:10.231477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.231499 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.333950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.333997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.334011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.334032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.334050 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.438093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.438170 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.438194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.438225 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.438254 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.482570 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.503442 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.522521 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.542012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.542087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.542104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.542129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.542147 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.543535 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.567203 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.603914 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.625050 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.645551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.645636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.645667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.645689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.645701 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.649189 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.678380 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.701587 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.724146 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.736851 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.748281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.748341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.748357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.748380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.748392 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.753259 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.770216 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.782573 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.851370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.851800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.851901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.851996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.852097 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.955311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.955366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.955376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.955393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:10 crc kubenswrapper[4772]: I0930 17:02:10.955402 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:10Z","lastTransitionTime":"2025-09-30T17:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.058824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.058866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.058875 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.058890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.058903 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.162524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.162584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.162604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.162626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.162641 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.265455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.265498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.265509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.265528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.265542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.368886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.368931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.368941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.368959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.368971 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.471902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.471945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.471956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.471976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.471985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.575211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.575251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.575261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.575278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.575288 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.677865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.677897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.677905 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.677920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.677929 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.780322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.780390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.780416 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.780433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.780446 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.883168 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.883232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.883248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.883274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.883288 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.897920 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.897973 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.898042 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:11 crc kubenswrapper[4772]: E0930 17:02:11.898225 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:11 crc kubenswrapper[4772]: E0930 17:02:11.898343 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:11 crc kubenswrapper[4772]: E0930 17:02:11.898561 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.985973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.986432 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.986450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.986480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:11 crc kubenswrapper[4772]: I0930 17:02:11.986500 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:11Z","lastTransitionTime":"2025-09-30T17:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.089855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.089891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.089901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.089920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.089930 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.192397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.192451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.192463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.192480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.192492 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.295857 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.295913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.295927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.295949 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.295963 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.399444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.399495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.399507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.399524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.399535 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.503010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.503130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.503150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.503179 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.503199 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.607989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.608038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.608050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.608083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.608096 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.711783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.711835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.711847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.711868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.711881 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.814678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.814752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.814764 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.814784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.814798 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.918358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.918415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.918427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.918446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:12 crc kubenswrapper[4772]: I0930 17:02:12.918466 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:12Z","lastTransitionTime":"2025-09-30T17:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.021262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.021340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.021360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.021387 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.021406 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.124796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.124869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.124885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.124909 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.124924 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.178826 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/0.log" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.181872 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0" exitCode=1 Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.181945 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.182735 4772 scope.go:117] "RemoveContainer" containerID="daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.198504 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.213986 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.227829 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.228376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.228419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.228432 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.228456 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.228469 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.268266 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.288544 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.308511 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.329145 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.330984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.331077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.331093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.331115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.331130 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.344049 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.359106 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.382988 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:12Z\\\",\\\"message\\\":\\\" 6011 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:12.234017 6011 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:12.234132 6011 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:12.234150 6011 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:12.234182 6011 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:12.234197 6011 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:12.234239 6011 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:02:12.234341 6011 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:02:12.234429 6011 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:12.234441 6011 factory.go:656] Stopping watch factory\\\\nI0930 17:02:12.234449 6011 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 17:02:12.234461 6011 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:12.234474 6011 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:12.234486 6011 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:12.234497 6011 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:12.234509 6011 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.404028 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.422989 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.434238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.434286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.434298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.434316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.434334 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.441540 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.456116 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.536977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.537025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.537035 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.537051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.537088 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.640045 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.640133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.640145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.640164 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.640181 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.743008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.743049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.743075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.743096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.743108 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.846815 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.846858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.846867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.846884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.846894 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.897625 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.897737 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:13 crc kubenswrapper[4772]: E0930 17:02:13.897787 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.897820 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:13 crc kubenswrapper[4772]: E0930 17:02:13.897936 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:13 crc kubenswrapper[4772]: E0930 17:02:13.898197 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.950452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.950514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.950533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.950558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:13 crc kubenswrapper[4772]: I0930 17:02:13.950575 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:13Z","lastTransitionTime":"2025-09-30T17:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.053322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.053379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.053394 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.053419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.053434 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.156446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.156502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.156516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.156537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.156549 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.188003 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/0.log" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.191977 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.192588 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.205709 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.222165 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.236176 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.254596 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.259903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.259955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc 
kubenswrapper[4772]: I0930 17:02:14.259970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.259991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.260005 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.263174 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd"] Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.263773 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.266802 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.266969 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.270951 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.285831 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.285871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.285905 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhvf\" (UniqueName: \"kubernetes.io/projected/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-kube-api-access-bfhvf\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.285937 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.295607 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b
7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:12Z\\\",\\\"message\\\":\\\" 6011 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:12.234017 6011 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:12.234132 6011 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:12.234150 6011 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:12.234182 6011 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:12.234197 6011 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:12.234239 6011 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:02:12.234341 6011 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:02:12.234429 6011 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:12.234441 6011 factory.go:656] Stopping watch factory\\\\nI0930 17:02:12.234449 6011 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 17:02:12.234461 6011 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:12.234474 6011 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:12.234486 6011 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:12.234497 6011 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:12.234509 6011 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.311286 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.326050 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.339533 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.353665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.362628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.362864 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.363032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.363238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.363386 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.369859 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.388010 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.388463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.388660 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfhvf\" (UniqueName: \"kubernetes.io/projected/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-kube-api-access-bfhvf\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.388869 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.389083 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.389302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.392482 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.394947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.410887 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfhvf\" (UniqueName: \"kubernetes.io/projected/63c1dd91-22dc-4f0e-aca4-1a609b6cdf03-kube-api-access-bfhvf\") pod \"ovnkube-control-plane-749d76644c-jm5rd\" (UID: \"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.412719 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.432114 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.455410 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.467142 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.467226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.467250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.467286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.467311 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.471672 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.494159 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.510386 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.540953 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.557033 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.569662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.569713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc 
kubenswrapper[4772]: I0930 17:02:14.569723 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.569741 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.569754 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.571372 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 
30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.579733 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.586437 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: W0930 17:02:14.597391 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c1dd91_22dc_4f0e_aca4_1a609b6cdf03.slice/crio-4216ad741fddbfc52a92591dbfbf75d1b9d83b5de98107b9cc6c541b134ffd64 WatchSource:0}: Error finding container 4216ad741fddbfc52a92591dbfbf75d1b9d83b5de98107b9cc6c541b134ffd64: Status 404 returned error can't find the container with id 4216ad741fddbfc52a92591dbfbf75d1b9d83b5de98107b9cc6c541b134ffd64 Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.605520 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.623179 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.636407 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.657702 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:12Z\\\",\\\"message\\\":\\\" 6011 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:12.234017 6011 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:12.234132 6011 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:12.234150 6011 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:12.234182 6011 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:12.234197 6011 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:12.234239 6011 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:02:12.234341 6011 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:02:12.234429 6011 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:12.234441 6011 factory.go:656] Stopping watch factory\\\\nI0930 17:02:12.234449 6011 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 17:02:12.234461 6011 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:12.234474 6011 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:12.234486 6011 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:12.234497 6011 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:12.234509 6011 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.673401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.673453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.673469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.673488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.673499 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.674911 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.689867 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.704146 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.776989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.777303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.777370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.777400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.777743 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.881170 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.881243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.881262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.881291 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.881312 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.985124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.985195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.985216 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.985246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:14 crc kubenswrapper[4772]: I0930 17:02:14.985270 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:14Z","lastTransitionTime":"2025-09-30T17:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.088702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.088757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.088771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.088793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.088809 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.192299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.192353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.192370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.192397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.192414 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.206430 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/1.log" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.207310 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/0.log" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.212312 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a" exitCode=1 Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.212441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.212789 4772 scope.go:117] "RemoveContainer" containerID="daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.213037 4772 scope.go:117] "RemoveContainer" containerID="007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.213215 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.217010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" event={"ID":"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03","Type":"ContainerStarted","Data":"4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.217134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" event={"ID":"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03","Type":"ContainerStarted","Data":"218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.217156 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" event={"ID":"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03","Type":"ContainerStarted","Data":"4216ad741fddbfc52a92591dbfbf75d1b9d83b5de98107b9cc6c541b134ffd64"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.231435 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.256308 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.275488 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.296794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.296887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.296913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.296945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.296964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.301115 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.321408 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.342678 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.365027 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.389614 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.400905 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wlgc4"] Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401593 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401675 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.401767 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.401871 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.408441 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.426748 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.447643 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.476259 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.494846 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.498549 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h58x\" (UniqueName: \"kubernetes.io/projected/0f2541dd-c77d-4bc5-9771-6ac741731464-kube-api-access-8h58x\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.498675 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.504729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.504779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.504796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.504820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.504839 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.522288 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:12Z\\\",\\\"message\\\":\\\" 6011 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:12.234017 6011 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:12.234132 6011 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:12.234150 6011 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:12.234182 6011 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:12.234197 6011 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:12.234239 6011 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:02:12.234341 6011 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:02:12.234429 6011 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:12.234441 6011 factory.go:656] Stopping watch factory\\\\nI0930 17:02:12.234449 6011 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 17:02:12.234461 6011 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:12.234474 6011 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:12.234486 6011 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:12.234497 6011 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:12.234509 6011 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 
17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.541896 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.560770 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.583099 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.600667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.600809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h58x\" (UniqueName: \"kubernetes.io/projected/0f2541dd-c77d-4bc5-9771-6ac741731464-kube-api-access-8h58x\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.600982 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.601160 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:16.101118951 +0000 UTC m=+37.008132002 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.608576 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.609275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.609337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.609366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.609400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.609425 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.632680 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.638437 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h58x\" (UniqueName: \"kubernetes.io/projected/0f2541dd-c77d-4bc5-9771-6ac741731464-kube-api-access-8h58x\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.653353 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.675113 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.701102 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.702099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702305 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:02:31.702269012 +0000 UTC m=+52.609281863 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.702375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.702510 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.702569 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702596 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702636 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702661 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702708 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702740 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:02:31.702714204 +0000 UTC m=+52.609727065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.702610 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702801 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702847 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702851 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:31.702763605 +0000 UTC m=+52.609776466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702872 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.702952 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:31.702929099 +0000 UTC m=+52.609941970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.703105 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.703184 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:31.703164686 +0000 UTC m=+52.610177557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.713042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.713143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.713161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.713189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.713224 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.717927 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.738843 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.767049 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.788172 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.808532 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.817303 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.817365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.817383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.817417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.817435 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.841297 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b
7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daf7a6dfeadc2eb4e393dcc88ff4e83d03d1f8de571f1d5d6f8ca51804dcc7c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:12Z\\\",\\\"message\\\":\\\" 6011 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:12.234017 6011 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:12.234132 6011 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:12.234150 6011 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:12.234182 6011 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:12.234197 6011 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:12.234239 6011 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:02:12.234341 6011 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:02:12.234429 6011 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:12.234441 6011 factory.go:656] Stopping watch factory\\\\nI0930 17:02:12.234449 6011 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 17:02:12.234461 6011 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:12.234474 6011 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:12.234486 6011 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:12.234497 6011 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:12.234509 6011 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch 
factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.858456 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.877751 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.893153 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.897577 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.897612 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.897814 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.897898 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.898032 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:15 crc kubenswrapper[4772]: E0930 17:02:15.898145 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.921381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.921442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.921468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.921502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:15 crc kubenswrapper[4772]: I0930 17:02:15.921531 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:15Z","lastTransitionTime":"2025-09-30T17:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.025480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.025575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.025594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.025622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.025644 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.107998 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.108356 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.108547 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:17.108499779 +0000 UTC m=+38.015512820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.129031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.129163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.129188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.129228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.129254 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.223883 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/1.log" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.228772 4772 scope.go:117] "RemoveContainer" containerID="007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.228933 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.231646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.231674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.231685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.231699 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.231710 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.250312 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.267427 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.294816 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.316273 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.334731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.334796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.334814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.334843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.334864 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.337379 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.355118 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.377440 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.381289 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.381344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.381360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.381381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.381396 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.396605 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.404342 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.411202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.411279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.411305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.411333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.411351 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.419208 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.433955 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.440636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.440704 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.440722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.440752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.440770 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.443932 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.461808 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0
dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.463487 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.467375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.467661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.467926 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.468145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.468298 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.485016 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"container
ID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.489687 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.495150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.495231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.495257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.495296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.495321 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.513973 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b
7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.516281 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.516407 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.518670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.518702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.518711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.518728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.518740 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.532199 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.549715 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
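
Every status patch in this stretch is rejected for the same underlying reason: the kubelet's PATCH calls are intercepted by the network-node-identity validating webhook on 127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, more than a month before the log's current time (2025-09-30T17:02:16Z). Node status updates therefore give up after the retry limit ("update node status exceeds retry count" above). One way to confirm the expiry from the node, a sketch assuming openssl is installed and the webhook is still listening on the host/port shown in the log:

    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates

In a CRC cluster this is the classic symptom of booting a VM whose rotated certificates have lapsed; cert rotation normally recovers on its own a few minutes after startup, so errors like these are usually transient.
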
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.567549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.621381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.621437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.621450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.621470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.621483 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.725221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.725286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.725301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.725325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.725342 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.829441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.829511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.829520 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.829552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.829570 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.897484 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:16 crc kubenswrapper[4772]: E0930 17:02:16.897699 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.932719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.932773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.932786 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.932805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:16 crc kubenswrapper[4772]: I0930 17:02:16.932820 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:16Z","lastTransitionTime":"2025-09-30T17:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.036141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.036248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.036267 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.036299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.036321 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.120796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:17 crc kubenswrapper[4772]: E0930 17:02:17.121149 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:17 crc kubenswrapper[4772]: E0930 17:02:17.121286 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:19.121253203 +0000 UTC m=+40.028266044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.140272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.140349 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.140370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.140399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.140418 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.243602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.243690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.243715 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.243744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.243766 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.347585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.347649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.347667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.347693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.347712 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.451169 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.451237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.451257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.451287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.451305 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.554989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.555129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.555149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.555182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.555202 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.659874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.659956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.659974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.660002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.660024 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.764202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.764283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.764307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.764339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.764364 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.868851 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.868914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.868925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.868945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.868957 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.898286 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.898392 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.898323 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:17 crc kubenswrapper[4772]: E0930 17:02:17.898526 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:17 crc kubenswrapper[4772]: E0930 17:02:17.898855 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:17 crc kubenswrapper[4772]: E0930 17:02:17.899161 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
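
The flood of NodeNotReady conditions citing "no CNI configuration file in /etc/kubernetes/cni/net.d/", and the "No sandbox for pod can be found" messages interleaved with them, share one cause: ovn-kubernetes has not yet written its CNI config, so the container runtime cannot create pod sandboxes and the kubelet keeps the node at Ready=False. The condition clears once the ovnkube-node pod (shown earlier in this log in CrashLoopBackOff) comes up and drops a config file into that directory. A quick check from the node, assuming shell access:

    ls -l /etc/kubernetes/cni/net.d/   # stays empty until the network plugin starts
    crictl ps | grep ovnkube           # is ovnkube-node actually running yet?
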
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.972494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.972559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.972580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.972609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:17 crc kubenswrapper[4772]: I0930 17:02:17.972631 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:17Z","lastTransitionTime":"2025-09-30T17:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.075804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.075856 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.075867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.075888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.075902 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.179409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.179506 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.179525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.179552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.179572 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.282158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.282209 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.282224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.282246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.282260 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.384880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.384958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.384993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.385032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.385086 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.487655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.487691 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.487699 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.487712 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.487723 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.591355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.591431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.591449 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.591478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.591497 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.695576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.695657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.695676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.695707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.695727 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.799098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.799155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.799165 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.799180 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.799190 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.897560 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:18 crc kubenswrapper[4772]: E0930 17:02:18.897773 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.902328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.902431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.902453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.902485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:18 crc kubenswrapper[4772]: I0930 17:02:18.902506 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:18Z","lastTransitionTime":"2025-09-30T17:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.006408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.006489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.006510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.006536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.006554 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.114641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.114729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.114753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.114784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.114809 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.147033 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:19 crc kubenswrapper[4772]: E0930 17:02:19.147293 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:19 crc kubenswrapper[4772]: E0930 17:02:19.147406 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:23.147377228 +0000 UTC m=+44.054390069 (durationBeforeRetry 4s). 
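
Note the retry cadence on the metrics-certs mount: the first failure (17:02:17, above) scheduled a retry after 2s, and this one schedules 4s; the kubelet backs off exponentially on repeated MountVolume.SetUp failures. The underlying error, object "openshift-multus"/"metrics-daemon-secret" not registered, typically indicates the kubelet's secret manager has not yet registered that Secret for the pod, a common transient while node status and networking are still settling; a later retry succeeds once the object registers. To verify the Secret exists, assuming cluster-admin access:

    oc -n openshift-multus get secret metrics-daemon-secret
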
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.217938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.218008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.218025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.218054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.218102 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.321971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.322093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.322112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.322141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.322161 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.425359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.425426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.425445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.425475 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.425494 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.528015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.528104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.528141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.528165 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.528184 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.630576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.630623 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.630641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.630663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.630680 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.732632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.732701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.732720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.732745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.732761 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.835081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.835376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.835385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.835398 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.835408 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.897550 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.897599 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:19 crc kubenswrapper[4772]: E0930 17:02:19.897699 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.897708 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:19 crc kubenswrapper[4772]: E0930 17:02:19.898002 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:19 crc kubenswrapper[4772]: E0930 17:02:19.897964 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.918291 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name
\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.935867 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.938320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.938366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.938386 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.938410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.938424 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:19Z","lastTransitionTime":"2025-09-30T17:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.959849 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed8145
1ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.971576 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:19 crc kubenswrapper[4772]: I0930 17:02:19.984870 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.000973 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.022620 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.037316 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.041296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.041332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.041343 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.041361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.041373 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:20Z","lastTransitionTime":"2025-09-30T17:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.058735 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b
7924fbd9f586a57f138a466a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.069883 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.083468 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.097460 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.111165 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.124124 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.144509 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z"
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.144591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.145083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.145100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.145120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.145135 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:20Z","lastTransitionTime":"2025-09-30T17:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.159553 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:20Z is after 2025-08-24T17:21:41Z"
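Every "Failed to update status for pod" entry above fails for the same reason: the kubelet's status patch is rejected because the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-09-30. A quick way to confirm what that endpoint is actually serving is to pull the certificate without trusting it and read its validity window. A minimal sketch in Python, assuming the webhook is listening locally and the third-party cryptography package is available:

    import socket, ssl
    from cryptography import x509

    # Fetch the serving certificate WITHOUT verifying it; verification
    # is exactly the step that fails in the kubelet log above.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection(("127.0.0.1", 9743), timeout=5) as sock:
        with ctx.wrap_socket(sock) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("subject:  ", cert.subject.rfc4514_string())
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # expect 2025-08-24 17:21:41, per the x509 error

If notAfter matches the date in the error, the fix is to rotate the webhook's serving certificate through the cluster's own certificate machinery (on CRC, typically by letting the cluster regenerate its certificates on startup), not to adjust the node clock.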
[17:02:20.248405 to 17:02:20.868389: the same five-entry sequence repeats seven more times at roughly 100 ms intervals: four kubelet_node_status.go:724 "Recording event message for node" events (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) followed by setters.go:603 "Node became not ready" with reason KubeletNotReady and the same "no CNI configuration file in /etc/kubernetes/cni/net.d/" message]
Sep 30 17:02:20 crc kubenswrapper[4772]: I0930 17:02:20.897550 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:20 crc kubenswrapper[4772]: E0930 17:02:20.897741 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
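The NotReady loop states its own cause: there is no CNI configuration file under /etc/kubernetes/cni/net.d/, so the runtime reports NetworkPluginNotReady and no new pod sandbox (such as the one needed by network-metrics-daemon-wlgc4) can be created. A minimal standalone way to check for the file the runtime is waiting on, as a sketch of the condition rather than the kubelet's actual CRI-based probe:

    import json
    from pathlib import Path

    # Directory named in the kubelet error message.
    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

    # The network is considered configured once a parseable .conf or
    # .conflist file appears here (ovn-kubernetes writes one on startup).
    candidates = sorted(CNI_CONF_DIR.glob("*.conf")) + sorted(CNI_CONF_DIR.glob("*.conflist"))
    if not candidates:
        print(f"no CNI configuration file in {CNI_CONF_DIR}/ -- matches the kubelet error")
    for path in candidates:
        with path.open() as fh:
            conf = json.load(fh)
        plugins = conf.get("type") or [p.get("type") for p in conf.get("plugins", [])]
        print(path.name, "->", conf.get("name"), plugins)

The directory presumably stays empty here because ovnkube-node itself cannot make progress while its status patches are rejected by the expired webhook certificate, which would tie the two recurring errors in this capture to a single root cause.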
[17:02:20.971490 to 17:02:21.896025: the same five-entry NotReady sequence repeats ten more times, unchanged apart from the timestamps]
Sep 30 17:02:21 crc kubenswrapper[4772]: I0930 17:02:21.897190 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:21 crc kubenswrapper[4772]: I0930 17:02:21.897233 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:21 crc kubenswrapper[4772]: I0930 17:02:21.897238 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:21 crc kubenswrapper[4772]: E0930 17:02:21.897304 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:21 crc kubenswrapper[4772]: E0930 17:02:21.897470 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:21 crc kubenswrapper[4772]: E0930 17:02:21.897559 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
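The kubenswrapper payload inside each journal entry uses the klog header format: a severity letter (I, W, E, F), MMDD, a microsecond timestamp, the PID, the emitting source file and line, then the quoted structured message. When slicing a capture like this one, it helps to split the stream back into typed records; the regex below is an assumption fitted to this capture's layout, not an official parser:

    import re

    # klog header, e.g.: E0930 17:02:21.897304 4772 pod_workers.go:1301] "..."
    KLOG = re.compile(
        r'(?P<sev>[IWEF])(?P<date>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d{6})\s+'
        r'(?P<pid>\d+) (?P<src>[\w.]+:\d+)\] (?P<msg>.*)'
    )

    line = ('Sep 30 17:02:21 crc kubenswrapper[4772]: E0930 17:02:21.897304 4772 '
            'pod_workers.go:1301] "Error syncing pod, skipping" '
            'pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"')

    m = KLOG.search(line)
    if m:
        print(m.group("sev"), m.group("time"), m.group("src"), "->", m.group("msg"))

Filtering on sev == "E" immediately reduces this capture to the "Error syncing pod" and webhook-call failures, which is usually the fastest way to separate the recurring NotReady heartbeat from the entries that explain it.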
[17:02:21.998496 to 17:02:22.822439: the same five-entry NotReady sequence repeats nine more times]
Sep 30 17:02:22 crc kubenswrapper[4772]: I0930 17:02:22.897361 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:22 crc kubenswrapper[4772]: E0930 17:02:22.897570 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
[17:02:22.925566 to 17:02:22.925692: the sequence repeats once more]
Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.028816 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.028881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.028903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.028933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.028956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.131860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.131898 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.131914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.131934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.131948 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.191263 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:23 crc kubenswrapper[4772]: E0930 17:02:23.191559 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:23 crc kubenswrapper[4772]: E0930 17:02:23.191695 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:31.191663361 +0000 UTC m=+52.098676242 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.235301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.235344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.235356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.235378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.235393 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.338947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.338995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.339014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.339035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.339049 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.442180 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.442229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.442242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.442259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.442271 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.544698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.544740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.544751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.544770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.544781 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.647593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.647641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.647651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.647671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.647683 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.750662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.750702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.750713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.750731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.750744 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.853986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.854035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.854050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.854109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.854126 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.898133 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.898177 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:23 crc kubenswrapper[4772]: E0930 17:02:23.898296 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.898354 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:23 crc kubenswrapper[4772]: E0930 17:02:23.898516 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:23 crc kubenswrapper[4772]: E0930 17:02:23.898640 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.956779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.956827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.956839 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.956855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:23 crc kubenswrapper[4772]: I0930 17:02:23.956867 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:23Z","lastTransitionTime":"2025-09-30T17:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.061509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.061600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.061618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.062201 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.062239 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.164545 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.164591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.164605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.164624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.164636 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.266417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.266468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.266478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.266493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.266503 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.369490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.369586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.369611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.369645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.369668 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.472655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.472709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.472721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.472741 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.472753 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.575096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.575126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.575135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.575149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.575158 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.678380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.678427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.678438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.678459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.678472 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.781424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.781496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.781508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.781530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.781544 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.885933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.886003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.886022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.886075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.886095 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.897234 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:24 crc kubenswrapper[4772]: E0930 17:02:24.897461 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.989593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.989643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.989654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.989676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:24 crc kubenswrapper[4772]: I0930 17:02:24.989690 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:24Z","lastTransitionTime":"2025-09-30T17:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.094313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.094374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.094383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.094408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.094420 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.196878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.196914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.196923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.196937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.196948 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.301381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.301454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.301474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.301503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.301523 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.406147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.406230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.406248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.406278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.406299 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.509668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.509732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.509749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.509769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.509784 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.614307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.614383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.614393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.614409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.614418 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.716974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.717010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.717017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.717030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.717040 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.820808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.820898 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.820934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.820970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.820993 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.898207 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.898284 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.898340 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:25 crc kubenswrapper[4772]: E0930 17:02:25.898378 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:25 crc kubenswrapper[4772]: E0930 17:02:25.898446 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:25 crc kubenswrapper[4772]: E0930 17:02:25.898537 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.923523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.923572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.923584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.923602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:25 crc kubenswrapper[4772]: I0930 17:02:25.923620 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:25Z","lastTransitionTime":"2025-09-30T17:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.027445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.027641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.027666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.027739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.027771 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.131919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.131963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.131978 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.131995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.132006 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.236160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.236228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.236238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.236254 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.236265 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.338338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.338397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.338411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.338429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.338441 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.445673 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.445767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.445794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.445833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.445857 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.548231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.548285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.548302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.548325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.548342 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.575087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.575212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.575229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.575254 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.575267 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.589709 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:26Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.593595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.593643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.593659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.593678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.593694 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.609174 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:26Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.613431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.613479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.613488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.613505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.613516 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.627226 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:26Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.631530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.631616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.631629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.631651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.631664 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.645559 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:26Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.651021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.651076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.651085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.651100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.651112 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.664284 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:26Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.664392 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
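[Diagnostic sketch, added for context; not part of the captured log. Every status patch attempt above fails for the same reason: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 (the address is taken from the error line) presents a serving certificate that expired on 2025-08-24T17:21:41Z. Assuming shell access on the node, a short Go program like the following could confirm the certificate's validity window; InsecureSkipVerify is deliberate here because verification is exactly what fails, and the goal is only to read the certificate, not to trust it.]

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Connect to the webhook endpoint reported in the kubelet error and
	// fetch its certificate chain without verifying it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	// The first peer certificate is the serving (leaf) certificate.
	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", leaf.Subject)
	fmt.Printf("notBefore: %s\n", leaf.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", leaf.NotAfter.Format(time.RFC3339))
	if time.Now().After(leaf.NotAfter) {
		fmt.Println("certificate is expired; this matches the kubelet's x509 error above")
	}
}

[Run against this node, it would be expected to print a notAfter of 2025-08-24T17:21:41Z, matching the expiry quoted in the repeated webhook errors.]

Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc"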
event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.666539 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.772271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.772321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.772333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.772349 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.772360 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.875820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.875874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.875890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.875912 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.875926 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.897238 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:26 crc kubenswrapper[4772]: E0930 17:02:26.897406 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.979230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.979303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.979316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.979358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:26 crc kubenswrapper[4772]: I0930 17:02:26.979372 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:26Z","lastTransitionTime":"2025-09-30T17:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.082995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.083139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.083244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.083275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.083338 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.186534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.186605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.186618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.186639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.186653 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.290567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.290651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.290668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.290702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.290722 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.393164 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.393210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.393221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.393238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.393253 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.496557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.496611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.496623 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.496644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.496657 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.599345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.599889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.600113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.600307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.600447 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.703261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.703329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.703344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.703363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.703375 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.806589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.806661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.806680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.806709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.806735 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.898152 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.898264 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.898264 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:27 crc kubenswrapper[4772]: E0930 17:02:27.898360 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:27 crc kubenswrapper[4772]: E0930 17:02:27.898473 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:27 crc kubenswrapper[4772]: E0930 17:02:27.898587 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.900271 4772 scope.go:117] "RemoveContainer" containerID="007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.909099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.909157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.909176 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.909208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:27 crc kubenswrapper[4772]: I0930 17:02:27.909229 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:27Z","lastTransitionTime":"2025-09-30T17:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.013135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.013183 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.013195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.013211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.013223 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.116403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.116466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.116482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.116506 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.116525 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.220001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.220096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.220114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.220137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.220156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.280317 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/1.log" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.282900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.283994 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.302564 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.316949 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.322838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.322920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.322972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.322996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc 
kubenswrapper[4772]: I0930 17:02:28.323013 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.338959 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\
\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.356905 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.376540 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.395503 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.411571 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425384 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425861 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.425871 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.448171 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f
57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 
17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.464393 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.477285 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.491100 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.509743 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.525918 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.528536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.528595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.528610 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.528630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.528643 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.542032 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.556007 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.632015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.632078 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.632089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.632104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.632116 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.735133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.735213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.735299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.735320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.735331 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.837976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.838051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.838086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.838110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.838125 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.897706 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:28 crc kubenswrapper[4772]: E0930 17:02:28.897931 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.941172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.941238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.941253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.941320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:28 crc kubenswrapper[4772]: I0930 17:02:28.941340 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:28Z","lastTransitionTime":"2025-09-30T17:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.044475 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.044521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.044536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.044570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.044590 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.147350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.147403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.147439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.147458 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.147471 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.252040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.252145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.252163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.252186 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.252205 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.289583 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/2.log" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.290412 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/1.log" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.295054 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367" exitCode=1 Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.295148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.295245 4772 scope.go:117] "RemoveContainer" containerID="007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.296574 4772 scope.go:117] "RemoveContainer" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367" Sep 30 17:02:29 crc kubenswrapper[4772]: E0930 17:02:29.296862 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.313663 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.329439 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.345539 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.354414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.354461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.354483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.354511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.354532 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.361920 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.378300 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.393780 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.406936 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.425522 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.444523 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.457295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.457575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.457677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.457771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.457857 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.463363 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.478372 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.493699 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.509883 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.523455 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.536665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.559176 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954c
c81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.560712 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.560740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.560752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.560768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 
17:02:29.560781 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.662951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.663006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.663015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.663033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.663070 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.766008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.766085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.766096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.766116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.766127 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.869195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.869286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.869307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.869340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.869362 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.897787 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.897915 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:29 crc kubenswrapper[4772]: E0930 17:02:29.897989 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:29 crc kubenswrapper[4772]: E0930 17:02:29.898134 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.898483 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:29 crc kubenswrapper[4772]: E0930 17:02:29.898802 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.916829 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.940773 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.966280 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.972248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.972428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:29 crc 
kubenswrapper[4772]: I0930 17:02:29.972554 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.972666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.972753 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:29Z","lastTransitionTime":"2025-09-30T17:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:29 crc kubenswrapper[4772]: I0930 17:02:29.985161 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 
30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.002480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:29Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.024910 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.043427 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.056850 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.075380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.075420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.075432 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.075448 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.075460 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.080164 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f
57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://007940e5aa2ac7ddd9229279608b9282f951e78b7924fbd9f586a57f138a466a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"message\\\":\\\"al\\\\nI0930 17:02:14.457111 6190 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 17:02:14.457135 6190 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:02:14.457152 6190 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:02:14.457137 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:02:14.457191 6190 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:02:14.457255 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:02:14.457312 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:02:14.457382 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:02:14.457409 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:02:14.457446 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:02:14.457381 6190 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:02:14.457499 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:02:14.457452 6190 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 17:02:14.457581 6190 factory.go:656] Stopping watch factory\\\\nI0930 17:02:14.457535 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:02:14.457644 6190 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:02:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start 
node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.092556 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.112262 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.130793 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.143297 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.160944 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.177345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.177373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.177406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.177421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.177431 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.189793 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.211366 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.279975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.280025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.280034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.280048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.280080 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.300411 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/2.log" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.303854 4772 scope.go:117] "RemoveContainer" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367" Sep 30 17:02:30 crc kubenswrapper[4772]: E0930 17:02:30.303989 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.320127 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identi
ty-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.335351 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.360022 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.373480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.383428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.383479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.383487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.383503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.383517 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.393647 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.409684 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.424258 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.440942 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.461507 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics
-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.475434 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.490025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.490099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.490124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.490148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.490165 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.494877 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.510576 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.525827 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.543638 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.560176 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.579211 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.593477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.593537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.593549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.593572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.593585 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:30Z","lastTransitionTime":"2025-09-30T17:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
[identical node-status block (four kubelet_node_status.go:724 "Recording event message for node" entries for NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID and NodeNotReady, plus the setters.go:603 "Node became not ready" entry with the same KubeletNotReady / no-CNI-configuration condition; only the timestamps advance) repeated at 17:02:30.696 and 17:02:30.799]
Sep 30 17:02:30 crc kubenswrapper[4772]: I0930 17:02:30.897933 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:30 crc kubenswrapper[4772]: E0930 17:02:30.898510 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
[same node-status block repeated at 17:02:30.904, 17:02:31.007, 17:02:31.110 and 17:02:31.214]
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.280349 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.280481 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:02:47.28045458 +0000 UTC m=+68.187467411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered
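The "Failed to update status for pod" entries earlier in this stretch all fail the same way: each status patch has to pass the pod.network-node-identity.openshift.io validating webhook on 127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, over a month before this boot. A quick way to confirm the validity window from the node, as a diagnostic sketch (not part of the log): the host and port come from the failing URL above, and the third-party cryptography package is assumed to be installed.

    # sketch: print the webhook serving certificate's validity window
    # (127.0.0.1:9743 is the endpoint from the failed webhook calls in the log)
    import ssl
    from cryptography import x509  # third-party "cryptography" package

    pem = ssl.get_server_certificate(("127.0.0.1", 9743))  # fetched without verification
    cert = x509.load_pem_x509_certificate(pem.encode())
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log

If notAfter is in the past, every webhook-gated API write from this kubelet keeps failing exactly as logged until the certificate is rotated.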
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.280135 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4"
[same node-status block repeated at 17:02:31.317, 17:02:31.420, 17:02:31.524, 17:02:31.628 and 17:02:31.731]
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.787290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.787447 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.787495 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.787749 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:03:03.78752476 +0000 UTC m=+84.694537611 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.787833 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.787857 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.787947 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.787957 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.787966 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
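The durationBeforeRetry values show the volume manager's exponential backoff at work: the metrics-certs mount above was deferred 16s (retry at 17:02:47), while the operations just logged are already at 32s (retries at 17:03:03), each new failure doubling the delay. A minimal sketch of that doubling pattern; the base delay, factor and cap here are illustrative assumptions, not kubelet's exact constants.

    # sketch of the exponential backoff visible in durationBeforeRetry (16s, then 32s)
    # base/factor/cap are illustrative, not kubelet's exact values
    def backoff_delays(base: float = 0.5, factor: float = 2.0, cap: float = 128.0):
        delay = base
        while True:
            yield min(delay, cap)
            delay = min(delay * factor, cap)

    delays = backoff_delays()
    print([next(delays) for _ in range(8)])
    # -> [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0]  (note the 16 -> 32 step, as logged)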
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.787971 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788106 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:03:03.788036844 +0000 UTC m=+84.695049745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788154 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788165 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788279 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788300 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788169 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:03:03.788142576 +0000 UTC m=+84.695155477 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788440 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:03:03.788391153 +0000 UTC m=+84.695403984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.788463 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:03:03.788454404 +0000 UTC m=+84.695467235 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.834757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.834837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.834852 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.834869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.834882 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:31Z","lastTransitionTime":"2025-09-30T17:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.897238 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.897401 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.897426 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.897684 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.897843 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:31 crc kubenswrapper[4772]: E0930 17:02:31.898170 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.937791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.937857 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.937875 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.937903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:31 crc kubenswrapper[4772]: I0930 17:02:31.937922 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:31Z","lastTransitionTime":"2025-09-30T17:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[same node-status block repeated every ~100 ms, 10 times, from 17:02:31.937 through 17:02:32.868]
Sep 30 17:02:32 crc kubenswrapper[4772]: I0930 17:02:32.897585 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:32 crc kubenswrapper[4772]: E0930 17:02:32.897805 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
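Since the condition payload on the setters.go:603 entries is plain JSON, floods like the one collapsed above are easy to summarize offline. A small illustrative helper; the sample line is shortened from the real entries, and the regex assumes the klog formatting shown in this log:

    # illustrative helper: pull the JSON condition out of "Node became not ready" lines
    import json, re

    COND = re.compile(r'"Node became not ready" node="(?P<node>[^"]+)" condition=(?P<cond>\{.*\})')

    def parse(line: str):
        m = COND.search(line)
        return (m["node"], json.loads(m["cond"])) if m else None

    sample = ('I0930 17:02:32.972375 4772 setters.go:603] "Node became not ready" node="crc" '
              'condition={"type":"Ready","status":"False","reason":"KubeletNotReady",'
              '"message":"container runtime network not ready"}')
    node, cond = parse(sample)
    print(node, cond["reason"])  # -> crc KubeletNotReady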
[same node-status block repeated every ~100 ms, 9 times, from 17:02:32.972 through 17:02:33.797]
Sep 30 17:02:33 crc kubenswrapper[4772]: I0930 17:02:33.897946 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:33 crc kubenswrapper[4772]: I0930 17:02:33.898223 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:33 crc kubenswrapper[4772]: E0930 17:02:33.898351 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:02:33 crc kubenswrapper[4772]: I0930 17:02:33.898404 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:33 crc kubenswrapper[4772]: E0930 17:02:33.898584 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:33 crc kubenswrapper[4772]: E0930 17:02:33.898699 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[same node-status block repeated at 17:02:33.900 and 17:02:34.004]
Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.042901 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.058615 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.066224 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.088411 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.105686 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.107740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.107794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.107811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.107835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.107872 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.125052 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.155283 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f
57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.172452 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.188354 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.201852 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.210566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.210624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.210641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.210663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.210680 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.217012 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.235490 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.253019 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.269164 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.284948 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.297547 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.312581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.312622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.312636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.312651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.312662 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.316155 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.331591 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:34Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.415046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.415107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.415115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.415132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.415143 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.517713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.517747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.517755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.517769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.517779 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.620734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.620782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.620790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.620804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.620814 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.723594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.723647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.723664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.723683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.723696 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.826095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.826147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.826161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.826177 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.826193 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.897146 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:34 crc kubenswrapper[4772]: E0930 17:02:34.897316 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.929193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.929234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.929253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.929272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:34 crc kubenswrapper[4772]: I0930 17:02:34.929283 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:34Z","lastTransitionTime":"2025-09-30T17:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.032099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.032158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.032171 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.032190 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.032201 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.135163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.135224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.135234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.135251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.135263 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.238784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.238843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.238859 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.238879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.238896 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.342489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.342583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.342614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.342648 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.342671 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.446773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.446837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.446855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.446880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.446899 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.550235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.550313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.550337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.550371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.550396 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.653969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.654041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.654100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.654142 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.654167 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.757843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.757909 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.757929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.757960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.757985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.860613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.860670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.860680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.860696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.860705 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.897942 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.897936 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.898170 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:35 crc kubenswrapper[4772]: E0930 17:02:35.898428 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:35 crc kubenswrapper[4772]: E0930 17:02:35.898539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:35 crc kubenswrapper[4772]: E0930 17:02:35.898663 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.967703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.967786 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.967817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.967848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:35 crc kubenswrapper[4772]: I0930 17:02:35.967872 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:35Z","lastTransitionTime":"2025-09-30T17:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.071941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.072009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.072028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.072092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.072114 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.175294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.175380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.175397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.175433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.175453 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.278878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.278923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.278934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.278948 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.278958 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.382005 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.382091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.382110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.382137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.382157 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.485826 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.485890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.485906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.485929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.485947 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.588420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.588497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.588515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.588540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.588558 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.691556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.691624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.691643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.691670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.691690 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.789701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.789789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.789814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.789849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.789872 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.815400 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:36Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.821221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.821298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.821318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.821344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.821364 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.837677 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:36Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.842401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.842463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.842488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.842517 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.842537 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.864949 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:36Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.871241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.871338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.871361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.871390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.871411 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.894506 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:36Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.897608 4772 util.go:30] "No sandbox for pod can be found. 
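The E0930 "Error updating node status, will retry" entries above all fail the same way: the API server cannot call the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 because its serving certificate expired on 2025-08-24T17:21:41Z, more than a month before the log's current time, so every node-status patch is rejected. A minimal diagnostic sketch for confirming the expiry from the node itself; it assumes the webhook is still listening on 127.0.0.1:9743 and that the third-party cryptography package is installed (both assumptions layered on top of the log, not something the log shows):

    # Diagnostic sketch (assumed setup): read the serving certificate of the
    # webhook endpoint named in the error and check its validity window.
    import socket
    import ssl
    from datetime import datetime

    from cryptography import x509  # assumed available; not in the stdlib

    HOST, PORT = "127.0.0.1", 9743  # endpoint taken from the webhook error above

    # Certificate verification is exactly what fails, so disable it here in
    # order to pull the expired certificate off the wire and inspect it.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER bytes

    cert = x509.load_der_x509_certificate(der)
    now = datetime.utcnow()  # naive UTC, matching cryptography's naive datetimes
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)
    # Expect True given the log: 2025-09-30 is after 2025-08-24T17:21:41Z.
    print("expired:", now > cert.not_valid_after)

If the printed notAfter matches the log's 2025-08-24T17:21:41Z, the certificate needs to be rotated (or the node clock corrected, if it is skewed). The NetworkReady=false / missing-CNI-configuration messages are a separate symptom and are what keeps the Ready condition False.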
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.897848 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.899678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.899748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.899765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.899794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.899813 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.919094 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:36Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:36 crc kubenswrapper[4772]: E0930 17:02:36.919410 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.921339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.921395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.921414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.921440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:36 crc kubenswrapper[4772]: I0930 17:02:36.921458 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:36Z","lastTransitionTime":"2025-09-30T17:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.025020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.025108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.025122 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.025143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.025156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.128719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.128800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.128827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.128858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.128883 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.232465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.232563 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.232587 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.232614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.232633 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.335502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.335585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.335607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.335925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.335946 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.439725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.439771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.439785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.439804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.439819 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.543114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.543175 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.543193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.543219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.543239 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.646962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.647031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.647049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.647110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.647128 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.749945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.749987 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.749996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.750008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.750017 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.853364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.853404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.853413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.853427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.853436 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.897711 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.897769 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:37 crc kubenswrapper[4772]: E0930 17:02:37.897826 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.897713 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:37 crc kubenswrapper[4772]: E0930 17:02:37.897923 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:37 crc kubenswrapper[4772]: E0930 17:02:37.898021 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.956965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.957001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.957010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.957023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:37 crc kubenswrapper[4772]: I0930 17:02:37.957034 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:37Z","lastTransitionTime":"2025-09-30T17:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.060761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.060808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.060817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.060832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.060845 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.163179 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.163213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.163230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.163244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.163256 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.265674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.265716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.265729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.265743 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.265752 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.369150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.369213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.369232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.369256 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.369273 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.472534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.472595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.472608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.472627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.472642 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.576497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.576608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.576625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.576654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.576676 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.679197 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.679261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.679278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.679309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.679328 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.782389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.782462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.782481 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.782506 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.782528 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.901580 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:38 crc kubenswrapper[4772]: E0930 17:02:38.901751 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.904577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.904638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.904651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.904671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:38 crc kubenswrapper[4772]: I0930 17:02:38.904683 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:38Z","lastTransitionTime":"2025-09-30T17:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.008022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.008107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.008124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.008148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.008169 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.111829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.111937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.111956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.112261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.112291 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.216207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.216273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.216286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.216310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.216325 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.319339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.319429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.319461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.319488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.319506 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.423172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.423234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.423253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.423277 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.423297 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.527098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.527141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.527153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.527169 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.527181 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.630020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.630116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.630138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.630163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.630180 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.733814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.733879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.733896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.733920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.733941 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.837095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.837159 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.837177 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.837202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.837222 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.897417 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.897757 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:39 crc kubenswrapper[4772]: E0930 17:02:39.897891 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.897943 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:39 crc kubenswrapper[4772]: E0930 17:02:39.898331 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:39 crc kubenswrapper[4772]: E0930 17:02:39.898452 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.928044 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:39Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.940872 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.940957 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.940973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.940990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.941003 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:39Z","lastTransitionTime":"2025-09-30T17:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.944346 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:39Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.965452 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:39Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.981226 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:39Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:39 crc kubenswrapper[4772]: I0930 17:02:39.997484 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:39Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.015957 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.030269 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.044030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.044101 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.044110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.044127 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.044154 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:40Z","lastTransitionTime":"2025-09-30T17:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.048981 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.065317 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.082674 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.098014 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.115354 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.132606 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.145901 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.147261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.147355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.147712 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.147732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.147742 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:40Z","lastTransitionTime":"2025-09-30T17:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.161462 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.175643 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.192100 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.250492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.250547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.250557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.250572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.250582 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:40Z","lastTransitionTime":"2025-09-30T17:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:02:40 crc kubenswrapper[4772]: I0930 17:02:40.897264 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:40 crc kubenswrapper[4772]: E0930 17:02:40.897450 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
Sep 30 17:02:41 crc kubenswrapper[4772]: I0930 17:02:41.897309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:41 crc kubenswrapper[4772]: I0930 17:02:41.897417 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:41 crc kubenswrapper[4772]: I0930 17:02:41.897345 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:41 crc kubenswrapper[4772]: E0930 17:02:41.897539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:41 crc kubenswrapper[4772]: E0930 17:02:41.897848 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:41 crc kubenswrapper[4772]: E0930 17:02:41.897918 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:02:42 crc kubenswrapper[4772]: I0930 17:02:42.897780 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:42 crc kubenswrapper[4772]: E0930 17:02:42.897941 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
Sep 30 17:02:43 crc kubenswrapper[4772]: I0930 17:02:43.897778 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:43 crc kubenswrapper[4772]: I0930 17:02:43.897823 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:43 crc kubenswrapper[4772]: I0930 17:02:43.897789 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:43 crc kubenswrapper[4772]: E0930 17:02:43.897917 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:43 crc kubenswrapper[4772]: E0930 17:02:43.898273 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:43 crc kubenswrapper[4772]: E0930 17:02:43.898409 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:02:43 crc kubenswrapper[4772]: I0930 17:02:43.898716 4772 scope.go:117] "RemoveContainer" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367"
Sep 30 17:02:43 crc kubenswrapper[4772]: E0930 17:02:43.898895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641"
[... the status cycle repeats nine more times, ~100 ms apart, from 17:02:43.979295 through 17:02:44.803966, each pass re-recording the same four node events and the same KubeletNotReady condition ...]
Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.897299 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:44 crc kubenswrapper[4772]: E0930 17:02:44.897554 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
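Every entry above traces back to one condition: the container runtime finds no CNI network config, so it reports NetworkReady=false and the kubelet refuses to sync any pod that needs pod networking. A rough stand-in for that check (the path is the one named in the log; the extensions are the ones libcni loads), useful for confirming on the node whether the network plugin has written its config yet:

    # Rough stand-in for the runtime's CNI readiness check.
    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d"   # path named in the log
    EXTS = (".conf", ".conflist", ".json")  # config types libcni loads
    try:
        confs = sorted(f for f in os.listdir(CNI_DIR) if f.endswith(EXTS))
    except FileNotFoundError:
        confs = []
    print(confs or f"no CNI configuration file in {CNI_DIR} - not ready")

Here the directory stays empty because the ovnkube-controller container that would normally populate it is itself in CrashLoopBackOff.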
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.907248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.907303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.907318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.907342 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:44 crc kubenswrapper[4772]: I0930 17:02:44.907356 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:44Z","lastTransitionTime":"2025-09-30T17:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.011124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.011172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.011187 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.011209 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.011224 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.114466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.114514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.114525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.114544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.114557 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.218538 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.218589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.218600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.218618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.218631 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.322111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.322163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.322173 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.322188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.322199 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.425256 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.425386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.425403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.425421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.425437 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.527723 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.527778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.527791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.527811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.527823 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.631003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.631044 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.631092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.631108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.631165 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.733209 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.733241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.733248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.733264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.733275 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.835310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.835340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.835349 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.835360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.835369 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.897791 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.897818 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.897800 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:45 crc kubenswrapper[4772]: E0930 17:02:45.897975 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:45 crc kubenswrapper[4772]: E0930 17:02:45.898073 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:45 crc kubenswrapper[4772]: E0930 17:02:45.898146 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.938546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.938588 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.938596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.938611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:45 crc kubenswrapper[4772]: I0930 17:02:45.938622 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:45Z","lastTransitionTime":"2025-09-30T17:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.041433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.041483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.041494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.041510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.041522 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:46Z","lastTransitionTime":"2025-09-30T17:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... the status cycle repeats ten more times, ~100 ms apart, from 17:02:45.938546 through 17:02:46.866429 ...]
Sep 30 17:02:46 crc kubenswrapper[4772]: I0930 17:02:46.897313 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:46 crc kubenswrapper[4772]: E0930 17:02:46.897450 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
[... one more status cycle at 17:02:46.969859 ...]
Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.072674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.072737 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.072749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.072773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.072787 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.152534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.152591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.152604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.152622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.152635 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.166890 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:47Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.171828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.171902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.171921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.171953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.171973 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.186442 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:47Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.191423 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.191454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.191463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.191484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.191493 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.204517 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:47Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.210190 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.210249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
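Every "Error updating node status, will retry" entry above fails the same way: the API server cannot call the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743 because the certificate it presents expired on 2025-08-24T17:21:41Z, more than a month before the log's current time of 2025-09-30T17:02:47Z. A minimal Go sketch that confirms this from the node itself — the address is copied verbatim from the error; everything else is illustrative:

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Address taken from the kubelet error: Post "https://127.0.0.1:9743/node?timeout=10s"
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // inspection only: fetch the cert even though it is expired
        })
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore)
        fmt.Println("notAfter: ", cert.NotAfter)
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }

This is a common failure mode when a CRC VM is started long after its bundled certificates were minted; on a continuously running cluster the network-node-identity serving certificate would normally have been rotated before expiry.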
event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.210266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.210292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.210310 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.230132 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:47Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.234674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.234709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
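The kubelet does not give up after one failed PATCH: "will retry" corresponds to a small bounded retry loop, after which it emits the "update node status exceeds retry count" error seen a few entries below. A rough sketch of that control flow — the budget of 5 mirrors upstream kubelet's nodeStatusUpdateRetry constant and is an assumption about this particular build:

    package main

    import (
        "errors"
        "fmt"
    )

    // nodeStatusUpdateRetry mirrors the upstream kubelet retry budget (assumed here).
    const nodeStatusUpdateRetry = 5

    func updateNodeStatus(tryUpdate func(int) error) error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdate(i); err != nil {
                // each failure is logged as: "Error updating node status, will retry"
                continue
            }
            return nil
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        // While the webhook cert is expired, every attempt fails identically.
        err := updateNodeStatus(func(i int) error {
            return fmt.Errorf("attempt %d: x509: certificate has expired or is not yet valid", i)
        })
        fmt.Println(err) // update node status exceeds retry count
    }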
event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.234719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.234734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.234745 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.250738 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:47Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.250913 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.252657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
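The payload the kubelet keeps resending is a strategic merge patch: the $setElementOrder/conditions directive pins the ordering of the conditions list while only changed fields are carried alongside the current image inventory and nodeInfo. A stripped-down reconstruction of its shape — illustrative only, not the exact bytes from the log:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        patch := map[string]any{
            "status": map[string]any{
                "$setElementOrder/conditions": []map[string]string{
                    {"type": "MemoryPressure"}, {"type": "DiskPressure"},
                    {"type": "PIDPressure"}, {"type": "Ready"},
                },
                "conditions": []map[string]string{{
                    "type":   "Ready",
                    "status": "False",
                    "reason": "KubeletNotReady",
                }},
            },
        }
        out, _ := json.MarshalIndent(patch, "", "  ")
        fmt.Println(string(out)) // same structure as the "failed to patch status" payload above
    }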
event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.252682 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.252692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.252706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.252719 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.301330 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.301465 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.301543 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:03:19.301524922 +0000 UTC m=+100.208537753 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.356243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.356327 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.356340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.356362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.356374 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.459220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.459249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.459257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.459270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.459280 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.562146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.562174 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.562184 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.562198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.562207 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.665292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.665321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.665329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.665341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.665351 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.767734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.767779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.767794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.767810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.767821 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.870991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.871046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.871084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.871106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.871120 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.897732 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.897810 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.897732 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.897934 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.898131 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:47 crc kubenswrapper[4772]: E0930 17:02:47.898230 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.974178 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.974230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.974248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.974273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:47 crc kubenswrapper[4772]: I0930 17:02:47.974291 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:47Z","lastTransitionTime":"2025-09-30T17:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.077865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.077907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.077919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.077940 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.077953 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.181767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.181838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.181849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.181866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.181878 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.283608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.283637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.283645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.283659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.283670 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.385730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.385757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.385765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.385777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.385787 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.488273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.488307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.488316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.488329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.488338 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.590958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.591044 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.591111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.591143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.591169 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.693869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.693910 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.693923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.693939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.693949 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.796748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.796786 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.796797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.796810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.796820 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.898154 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:48 crc kubenswrapper[4772]: E0930 17:02:48.898318 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.899685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.899711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.899722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.899735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:48 crc kubenswrapper[4772]: I0930 17:02:48.899745 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:48Z","lastTransitionTime":"2025-09-30T17:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.002923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.002988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.003005 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.003031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.003047 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.105858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.105894 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.105913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.105929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.105942 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.208604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.208641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.208649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.208664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.208673 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.311604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.311642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.311650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.311664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.311673 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.366482 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/0.log" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.366590 4772 generic.go:334] "Generic (PLEG): container finished" podID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" containerID="6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1" exitCode=1 Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.366629 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerDied","Data":"6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.367113 4772 scope.go:117] "RemoveContainer" containerID="6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.380950 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"2025-09-30T17:02:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075\\\\n2025-09-30T17:02:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075 to /host/opt/cni/bin/\\\\n2025-09-30T17:02:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:02:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:02:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.395379 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.409004 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.413908 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.413943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.413955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.413971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.413983 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.419941 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.435729 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.450047 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.467200 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.477239 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.493871 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.503237 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.516263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.516518 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.516583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.516645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.516706 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.517453 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.529845 4772 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.540993 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.554722 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.573252 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.585558 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.597217 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.618553 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.618755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.618841 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.618902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.618956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.721551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.721919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.722015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.722131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.722223 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.826918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.826960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.826971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.826986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.826999 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.899747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:49 crc kubenswrapper[4772]: E0930 17:02:49.899878 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.900030 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.900132 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:49 crc kubenswrapper[4772]: E0930 17:02:49.900212 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:49 crc kubenswrapper[4772]: E0930 17:02:49.900351 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.919761 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.929274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.929328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.929339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.929358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.929368 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:49Z","lastTransitionTime":"2025-09-30T17:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.938860 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.952158 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.964829 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:49 crc kubenswrapper[4772]: I0930 17:02:49.990702 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:49Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.006823 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.023971 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.030788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.030827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.030837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.030868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.030879 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.044000 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"2025-09-30T17:02:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075\\\\n2025-09-30T17:02:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075 to /host/opt/cni/bin/\\\\n2025-09-30T17:02:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:02:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:02:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.058246 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 
17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.075627 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.092593 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.107858 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.119871 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.132834 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.132893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.132907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.132928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.132941 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.134971 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.149021 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.167288 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.179936 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.235580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.235764 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.235854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.235953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.236075 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.338976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.339025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.339036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.339071 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.339086 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.373114 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/0.log" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.373709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerStarted","Data":"8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.392404 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.411225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.424455 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.441294 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.442702 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.442773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.442794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.442824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.442843 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.474772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f
57479c85d605602394bdb367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.497455 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.510933 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.532713 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"2025-09-30T17:02:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075\\\\n2025-09-30T17:02:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075 to /host/opt/cni/bin/\\\\n2025-09-30T17:02:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:02:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:02:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.546030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.546103 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.546115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.546136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.546151 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.553635 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.570334 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea
3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.587692 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.607959 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.623767 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.639367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.649482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.649533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.649549 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.649570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.649586 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.655679 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.676650 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.690035 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.752314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.752363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.752377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.752396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.752411 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.855037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.855355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.855483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.855559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.855645 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.897484 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:50 crc kubenswrapper[4772]: E0930 17:02:50.898048 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.963456 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.963523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.963539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.963562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:50 crc kubenswrapper[4772]: I0930 17:02:50.963577 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:50Z","lastTransitionTime":"2025-09-30T17:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.065932 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.065970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.065983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.066001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.066012 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.168879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.168975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.168992 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.169015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.169032 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.277417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.277458 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.277468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.277485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.277497 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.379532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.379592 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.379603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.379619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.379639 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.482775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.482810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.482819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.482833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.482844 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.585848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.585902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.585911 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.585928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.585940 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.688791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.688854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.688867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.688885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.688899 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.791535 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.791579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.791589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.791606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.791616 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.895120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.895181 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.895193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.895213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.895226 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:51Z","lastTransitionTime":"2025-09-30T17:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.897393 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:51 crc kubenswrapper[4772]: E0930 17:02:51.897574 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.897786 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:51 crc kubenswrapper[4772]: E0930 17:02:51.897853 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:51 crc kubenswrapper[4772]: I0930 17:02:51.898156 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:51 crc kubenswrapper[4772]: E0930 17:02:51.898264 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... status cycle repeats at 17:02:51.998 ...]
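
Every failing pod above reports the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. A rough Python sketch of the readiness check implied by that message, for illustration only (the accepted file extensions here are an assumption; the kubelet's actual check lives in its CNI plugin code):

    # Rough sketch, not kubelet source: approximates the readiness check implied
    # by the log message. The directory path is taken from the log itself;
    # treating .conf/.conflist/.json as valid CNI config files is an assumption.
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

    def network_ready(conf_dir: Path = CNI_CONF_DIR) -> bool:
        if not conf_dir.is_dir():
            return False
        return any(p.suffix in {".conf", ".conflist", ".json"} for p in conf_dir.iterdir())

    if __name__ == "__main__":
        if network_ready():
            print("NetworkReady=true")
        else:
            print("NetworkReady=false reason:NetworkPluginNotReady message:"
                  "no CNI configuration file in /etc/kubernetes/cni/net.d/")

Until a network provider (the pod names above point at Multus, in the openshift-multus namespace) writes a config file into that directory, sandbox creation keeps failing and the node stays NotReady.
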
[... status cycle repeats at 17:02:52.101, 17:02:52.205, 17:02:52.308, 17:02:52.410, 17:02:52.514, and 17:02:52.618 ...]
[... status cycle repeats at 17:02:52.721 and 17:02:52.824 ...]
Sep 30 17:02:52 crc kubenswrapper[4772]: I0930 17:02:52.897307 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:52 crc kubenswrapper[4772]: E0930 17:02:52.897547 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
[... status cycle repeats at 17:02:52.928 and 17:02:53.031 ...]
[... status cycle repeats at 17:02:53.134, 17:02:53.237, 17:02:53.340, 17:02:53.445, 17:02:53.549, and 17:02:53.654 ...]
[... status cycle repeats at 17:02:53.757 and 17:02:53.860 ...]
Sep 30 17:02:53 crc kubenswrapper[4772]: I0930 17:02:53.897956 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:53 crc kubenswrapper[4772]: I0930 17:02:53.897954 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:53 crc kubenswrapper[4772]: E0930 17:02:53.898127 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:53 crc kubenswrapper[4772]: E0930 17:02:53.898494 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:53 crc kubenswrapper[4772]: I0930 17:02:53.898662 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:53 crc kubenswrapper[4772]: E0930 17:02:53.898830 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... status cycle repeats at 17:02:53.963 ...]
[... status cycle repeats at 17:02:54.066, 17:02:54.169, 17:02:54.271, 17:02:54.375, 17:02:54.477, 17:02:54.580, 17:02:54.682, 17:02:54.786, and 17:02:54.889 ...]
Sep 30 17:02:54 crc kubenswrapper[4772]: I0930 17:02:54.898152 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:02:54 crc kubenswrapper[4772]: E0930 17:02:54.898533 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
Sep 30 17:02:54 crc kubenswrapper[4772]: I0930 17:02:54.914893 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
[... status cycle repeats at 17:02:54.992 and 17:02:55.096 ...]
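
Captures like this one are dominated by the repeating status cycle, so a small summarizer helps when reading them. A sketch, assuming input in the single-line journalctl format shown above:

    # Sketch: counts kubenswrapper records by severity, source location, and
    # message, assuming journalctl-style lines as in this capture.
    import re
    import sys
    from collections import Counter

    # Matches the klog header, e.g.:
    #   I0930 17:02:51.379532 4772 setters.go:603] "Node became not ready"
    RECORD = re.compile(r'([IEW])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+) +\d+ (\S+)\] "([^"]*)"')

    def summarize(lines):
        counts = Counter()
        for line in lines:
            for severity, _date, _ts, source, message in RECORD.findall(line):
                counts[(severity, source, message)] += 1
        return counts

    if __name__ == "__main__":
        for (severity, source, message), n in summarize(sys.stdin).most_common():
            print(f"{n:5d}  {severity}  {source}  {message}")

Fed this window (for example via journalctl -u kubelet), it would show the five kubelet_node_status.go/setters.go messages dwarfing the handful of pod_workers.go errors.
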
[... status cycle repeats at 17:02:55.199, 17:02:55.303, 17:02:55.407, 17:02:55.510, 17:02:55.613, and 17:02:55.717 ...]
[... status cycle repeats at 17:02:55.820 ...]
Sep 30 17:02:55 crc kubenswrapper[4772]: I0930 17:02:55.897900 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:02:55 crc kubenswrapper[4772]: I0930 17:02:55.897958 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:02:55 crc kubenswrapper[4772]: I0930 17:02:55.897900 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:02:55 crc kubenswrapper[4772]: E0930 17:02:55.898119 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:02:55 crc kubenswrapper[4772]: E0930 17:02:55.898315 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:02:55 crc kubenswrapper[4772]: E0930 17:02:55.898418 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... status cycle repeats at 17:02:55.924 and 17:02:56.028 ...]
[... status cycle repeats at 17:02:56.133, 17:02:56.236, 17:02:56.339, 17:02:56.441, 17:02:56.544, and 17:02:56.648 ...]
Has your network provider started?"} Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.751174 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.751238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.751262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.751292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.751321 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:56Z","lastTransitionTime":"2025-09-30T17:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.854513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.854562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.854589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.854633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.854657 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:56Z","lastTransitionTime":"2025-09-30T17:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.897350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:56 crc kubenswrapper[4772]: E0930 17:02:56.897553 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.898836 4772 scope.go:117] "RemoveContainer" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.958379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.958435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.958444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.958458 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:56 crc kubenswrapper[4772]: I0930 17:02:56.958467 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:56Z","lastTransitionTime":"2025-09-30T17:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.060570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.060611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.060621 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.060650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.060661 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.163964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.164047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.164105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.164141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.164167 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.267232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.267291 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.267308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.267332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.267355 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.369926 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.369988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.370000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.370016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.370028 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.401619 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/2.log" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.472819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.472858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.472876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.472896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.472908 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.575195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.575235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.575245 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.575260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.575270 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.612502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.612548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.612564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.612596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.612611 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.627659 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:57Z is after 
2025-08-24T17:21:41Z" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.632382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.632430 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.632442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.632457 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.632467 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.651790 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:57Z is after 
2025-08-24T17:21:41Z" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.655950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.655990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.656004 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.656018 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.656027 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.675980 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8cd548ba-29ed-4d2b-b59b-8b79e6073d1d\\\",\\\"systemUUID\\\":\\\"0dcd8a16-1277-4116-9b8a-7e3bf2155fd4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:57Z is after 
2025-08-24T17:21:41Z" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.680675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.680723 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.680733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.680748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.680762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.698167 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.711188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.711229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.711239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.711252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.711261 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.724626 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z" Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.724747 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.726241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.726282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.726291 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.726305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.726314 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.829919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.829964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.829974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.829990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.830000 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.897312 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.897360 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.897426 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.897449 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.897558 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:57 crc kubenswrapper[4772]: E0930 17:02:57.897647 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.932805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.932854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.932866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.932885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:57 crc kubenswrapper[4772]: I0930 17:02:57.932898 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:57Z","lastTransitionTime":"2025-09-30T17:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.035411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.035456 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.035469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.035487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.035500 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.138035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.138112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.138125 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.138143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.138155 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.240866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.240905 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.240914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.240929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.240938 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.343753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.343808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.343821 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.343844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.343861 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
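[Editor's note: the NotReady condition repeats one cause verbatim: no CNI configuration file under /etc/kubernetes/cni/net.d/. On OVN-Kubernetes that file is written by the ovnkube-node pod, which is crash-looping just below, so the kubelet keeps reporting NetworkPluginNotReady. A short sketch, assuming it is run on the node itself, that reproduces the kubelet's check by listing candidate CNI config files; the extension list mirrors what libcni accepts and is an assumption, not something read from this log:]

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet's NetworkPluginNotReady message.
	const dir = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		// libcni considers files with these extensions; anything else is ignored.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println(filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI config found; kubelet stays NotReady until the network plugin writes one")
	}
}
```

[An empty result here is consistent with the repeated "NetworkReady=false" condition above; the check clears on its own once ovnkube-node comes up and writes its config.]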
Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.413139 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/3.log" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.413714 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/2.log" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.416355 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" exitCode=1 Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.416412 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.416520 4772 scope.go:117] "RemoveContainer" containerID="7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.417273 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:02:58 crc kubenswrapper[4772]: E0930 17:02:58.417448 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641"
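[Editor's note: ovnkube-controller exited with code 1, and the kubelet now holds it in CrashLoopBackOff rather than restarting it immediately. "back-off 40s" is the third step of the kubelet's crash-loop delay, which starts at 10s, doubles after each failed restart, is capped at 5 minutes, and resets after the container runs cleanly for 10 minutes; those are kubelet's documented defaults, not values read from this log. A tiny illustration of that progression:]

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Kubelet crash-loop restart delay: 10s base, doubled per crash, 5m cap.
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := base
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("after crash %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

[The third line of output, "back-off 40s", matches the delay reported in the entry above; the underlying crash reason lives in the rotated container logs (.../ovnkube-controller/2.log and 3.log) that the kubelet just parsed.]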
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.446645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.446689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.446703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.446720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.446734 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.451246 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.465549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.480659 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.497434 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535f199d-4a89-433b-aac8-3f2724b83ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df798ee45454483b34381b323fbd737cd341c65028ecb28daa91980909a9c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.513717 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.531459 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.550528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.550569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc 
kubenswrapper[4772]: I0930 17:02:58.550577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.550594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.550606 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.550771 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 
30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.571420 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.586239 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.605166 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.619255 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.642265 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:58Z\\\",\\\"message\\\":\\\"e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:02:58.056603 6736 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:02:58.056810 6736 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:02:58.056817 6736 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:02:58.056823 6736 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:02:58.056369 6736 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0930 17:02:58.056764 6736 model_client.go:382] Update operations generated as: [{Op:update 
Table\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.654317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.654358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.654370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.654386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.654398 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.661953 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.678637 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.699321 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"2025-09-30T17:02:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075\\\\n2025-09-30T17:02:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075 to /host/opt/cni/bin/\\\\n2025-09-30T17:02:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:02:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:02:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.711838 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 
17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.726325 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.757576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.757610 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.757619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.757632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.757641 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.860973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.861050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.861099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.861136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.861155 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.897962 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:02:58 crc kubenswrapper[4772]: E0930 17:02:58.898260 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.963913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.963974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.963984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.964001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:58 crc kubenswrapper[4772]: I0930 17:02:58.964011 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:58Z","lastTransitionTime":"2025-09-30T17:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.068525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.068640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.068659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.068684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.068701 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.173306 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.173387 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.173406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.173906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.173946 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.280719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.280776 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.280791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.280811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.280830 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.384096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.384488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.384500 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.384516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.384528 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.421843 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/3.log" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.487462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.487521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.487533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.487550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.487562 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.591298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.591383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.591410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.591444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.591470 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.694387 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.694460 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.694478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.694502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.694520 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.797343 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.797425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.797471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.797489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.797500 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.897968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:02:59 crc kubenswrapper[4772]: E0930 17:02:59.898143 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.897968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.898217 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:02:59 crc kubenswrapper[4772]: E0930 17:02:59.898287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:02:59 crc kubenswrapper[4772]: E0930 17:02:59.898448 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.899639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.899675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.899686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.899702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.899716 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:02:59Z","lastTransitionTime":"2025-09-30T17:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.915676 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263c9ae4-c3a0-4582-b53e-ad548284fef0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3a83bdc1abee2d4afdb2b0ec772315c9833b9b5774f50977aca72a61ab45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7587e47c9cbd2ea3b96235c3b5177d8f64eaa1d016002c8b4cc80e7c1d66a6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0186f7aa6b1264ad65ca1429214a479e754a091b5d818cac6cd88184226b8b66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2099c77f94a575fe68dfbabf0e5341a587fb6b664d2ea0909be9d0f58519dd3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.937090 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.951688 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f2541dd-c77d-4bc5-9771-6ac741731464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlgc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.964370 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535f199d-4a89-433b-aac8-3f2724b83ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df798ee45454483b34381b323fbd737cd341c65028ecb28daa91980909a9c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cb26b902c619b0a8b18b90ab720669bd5fbb4bda0c24d38aa06141d77cfe29f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.981919 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d98a003-e8df-4c23-b635-2a9d65bbd543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d084ec76785d5bfe00e11b2822646901d7567240f3c5a4df0493cd903c4c4519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cecf717fe5013b929053fc2f7e04a50461ee1baa49ac8d15a5ee328ba290051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633f883bd54db0d64100360657e84a67a6326f6f8d454f6da0966e1948a86845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d11bccd46d6dec11a1f9c86aa900e15c2eb44cefdecd4f476fe08479a585ba2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f6b0c44cf133e8e57000badcf8988158290c53988eb3aa409b9ee77b9efdf1f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:01:54Z\\\",\\\"message\\\":\\\"W0930 17:01:43.214318 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:01:43.214871 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759251703 cert, and key in /tmp/serving-cert-1786094550/serving-signer.crt, /tmp/serving-cert-1786094550/serving-signer.key\\\\nI0930 17:01:43.575285 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:01:43.577891 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:01:43.578120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:01:43.580769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786094550/tls.crt::/tmp/serving-cert-1786094550/tls.key\\\\\\\"\\\\nF0930 17:01:54.186473 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3abb219b2501b77b5be9f7d414eae306befcaddf0e71d0d5d700ddde816949b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3119624ac57befeef91ac7a2c59b9b570e6102c808d8483ecf6f072392e1d6d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:02:59 crc kubenswrapper[4772]: I0930 17:02:59.996261 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j5z7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82faca8b-622c-4731-a320-ff2bc04d040b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0115a80e9ebac2112f3306db1489a926859d997aa2119e7b66af2c7ed21ac04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m78s6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j5z7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:02:59Z is after 2025-08-24T17:21:41Z"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.006036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.006170 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.006194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.006221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.006241 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:00Z","lastTransitionTime":"2025-09-30T17:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.010496 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52c7d97913c52d26ece5d6c71351e02fd579dead2d26e39c2627ea50262af7dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010aff977af1cf258453257cb4ab71fd1d3f961f397caf1b2e7a6b742472169b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.025846 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd315e88e40ec02a051096cf2f5841e7241504c492e81385fb2cf7d3ce627cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z"
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.047434 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47rqk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec22019d-863b-4e4b-98a9-1ceaa9fbd9f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f67e210e8a9b9d6a23228c57295c353be42c7d6f9a34c42d8998bf268dc1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7aba5e07a96559627077936145fb06c8a91e1c293d5f955d6f45e953cb54e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312c
e4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8bd0ba668b5e8a31160a303ee1fdc57b143ed688cf75ad494a82364eb3334dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78428d6520c13c1060a37ec777f2d21ab0beecbaad36dfbfe11d4bccd84d8441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7467d578874de990a7e3a17794aa2ebd69a7351d60c839320756fc1a24024ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d9fb12a41b3a521d18b1cbdf50315a5c98639d26cedaf3e5fa80c6d3f7ed7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b81960046175e3f8e370df1eaf6396c011e18d1597d46222e472558ae24ded\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrxtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47rqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.065095 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fddb68762e1a58136bb74bae4599cb31ba367657c36cfe9e6e4183a05f7d07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rkhll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.094457 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47daa5db-853e-45af-98ae-489980c97641\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea599115365192b9f3529ede045b17f53f24d4f57479c85d605602394bdb367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:29Z\\\",\\\"message\\\":\\\"g{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 17:02:29.053493 6390 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:02:29.053534 6390 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Pos\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:58Z\\\",\\\"message\\\":\\\"e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:02:58Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:02:58.056603 6736 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:02:58.056810 6736 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:02:58.056817 6736 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:02:58.056823 6736 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:02:58.056369 6736 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0930 17:02:58.056764 6736 model_client.go:382] Update operations generated as: [{Op:update 
Table\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:02:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27g86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bj99l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.108614 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.110308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.110381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.110401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.110426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.110444 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:00Z","lastTransitionTime":"2025-09-30T17:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.123364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.137980 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19051430dde4d8df16e80bc4af56d7b9d993b7de7690ca02ba3519bd9457faf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.151875 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63c1dd91-22dc-4f0e-aca4-1a609b6cdf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218ac01192ab189c2a26325c0df461eb4cfd46da0407d454c660762e5090e957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4064f0554aad0ac8d0766459c59cfa183822d4c54583bfafc6f37ed2ea2c8f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bfhvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jm5rd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 
17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.165203 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c003aed8-5f6b-4b71-879e-02ee156d70f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9146b821cb907df28cb544ccd909a8c51761fde950ae641a64707c8cdbea71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60a9045ba0aad9b5d97eddbe7dc92e8ddc7ccc9e369c196b1632052f720e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e9c279258d92a669c674655e5259645adb62228ef7dd6ebdbbac8f18017d8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dec6b1362d95a00b14f443371a18d86c878e43824bb2080ccabf60a88843e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:01:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:01:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.179219 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k2jvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f081528-51e8-4088-bb5c-f51e7ab0bc7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee41dada11b5b88ecb5dd36f2c141d8457ed4ec927858f3fe9227c3467a625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fqk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k2jvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.193770 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7br52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:02:49Z\\\",\\\"message\\\":\\\"2025-09-30T17:02:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075\\\\n2025-09-30T17:02:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0c1f2d75-2b24-4385-ae14-882425581075 to /host/opt/cni/bin/\\\\n2025-09-30T17:02:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:02:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:02:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:02:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr9kl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:02:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7br52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:03:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.213773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.213823 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.213837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.213858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.213872 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:00Z","lastTransitionTime":"2025-09-30T17:03:00Z","reason":"KubeletNotReady","message":"container 
[... same five-entry node-status block (NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady events plus the "Node became not ready" KubeletNotReady condition) repeated 6 more times, 17:03:00.316 - 17:03:00.834 ...]
Sep 30 17:03:00 crc kubenswrapper[4772]: I0930 17:03:00.897829 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4"
Sep 30 17:03:00 crc kubenswrapper[4772]: E0930 17:03:00.898294 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"
[... same five-entry node-status block repeated 10 times, 17:03:00.936 - 17:03:01.875 ...]
Has your network provider started?"} Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.898330 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.898379 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.898553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:01 crc kubenswrapper[4772]: E0930 17:03:01.898566 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:01 crc kubenswrapper[4772]: E0930 17:03:01.898669 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:01 crc kubenswrapper[4772]: E0930 17:03:01.898758 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.979305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.979378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.979399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.979427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:01 crc kubenswrapper[4772]: I0930 17:03:01.979454 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:01Z","lastTransitionTime":"2025-09-30T17:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.087934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.088005 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.088024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.088079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.088104 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.191635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.191690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.191702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.191721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.191734 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.294768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.294844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.294862 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.294891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.294910 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.399359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.399425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.399443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.399473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.399493 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.503695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.503781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.503804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.503834 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.503858 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.607280 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.607335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.607352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.607379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.607396 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.710528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.710929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.711087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.711235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.711368 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.813988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.814041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.814053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.814098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.814116 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.897316 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:02 crc kubenswrapper[4772]: E0930 17:03:02.897631 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.918157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.918207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.918218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.918242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:02 crc kubenswrapper[4772]: I0930 17:03:02.918259 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:02Z","lastTransitionTime":"2025-09-30T17:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.020984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.021043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.021067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.021087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.021098 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.125218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.125310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.125334 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.125372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.125444 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.228379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.228813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.228939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.229082 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.229194 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.332892 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.333309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.333380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.333418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.333443 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.437996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.438099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.438110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.438131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.438141 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.540824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.540862 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.540884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.540901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.540911 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.644378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.644477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.644493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.644516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.644528 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.748239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.748352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.748371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.748399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.748420 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:03Z","lastTransitionTime":"2025-09-30T17:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.819479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.819696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.819808 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:07.819743896 +0000 UTC m=+148.726756767 (durationBeforeRetry 1m4s). 
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.819942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.819987 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.820174 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820258 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:04:07.820179387 +0000 UTC m=+148.727192258 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820320 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820358 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820364 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.820375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820404 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820448 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:04:07.820427393 +0000 UTC m=+148.727440384 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820494 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:04:07.820471845 +0000 UTC m=+148.727484716 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820565 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820588 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820616 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.820684 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:04:07.82066522 +0000 UTC m=+148.727678081 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
[... same five-entry node-status block at 17:03:03.852 ...]
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.899832 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.899966 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:03:03 crc kubenswrapper[4772]: I0930 17:03:03.900158 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.900164 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.900332 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:03:03 crc kubenswrapper[4772]: E0930 17:03:03.900570 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.058992 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.059074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.059088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.059109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.059124 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.163263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.163628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.163717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.163822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.163914 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.267694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.267757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.267772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.267795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.267862 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.379088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.379168 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.379493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.379546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.379574 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.483774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.483849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.483870 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.483896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.483915 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.588415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.588473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.588485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.588505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.588523 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.692616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.692692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.692717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.692749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.692769 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.797860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.797913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.797925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.797945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.797956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.898103 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:04 crc kubenswrapper[4772]: E0930 17:03:04.898339 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.901973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.902140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.902176 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.902205 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:04 crc kubenswrapper[4772]: I0930 17:03:04.902224 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:04Z","lastTransitionTime":"2025-09-30T17:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.006464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.006768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.006860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.006948 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.007043 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.110455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.110733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.110836 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.110912 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.110979 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.213953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.214454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.214658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.214860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.215008 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.318641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.318758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.318968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.318997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.319019 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.422703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.422760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.422773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.422793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.422805 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.526761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.526848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.526867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.526903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.526922 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.630929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.631008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.631030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.631134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.631170 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.734280 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.734354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.734375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.734410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.734429 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.837487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.837531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.837543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.837564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.837577 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.897881 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.897951 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:05 crc kubenswrapper[4772]: E0930 17:03:05.898176 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.898273 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:05 crc kubenswrapper[4772]: E0930 17:03:05.898489 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:05 crc kubenswrapper[4772]: E0930 17:03:05.898833 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.940840 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.941397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.941639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.941864 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:05 crc kubenswrapper[4772]: I0930 17:03:05.942107 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:05Z","lastTransitionTime":"2025-09-30T17:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.045660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.045761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.046028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.046095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.046118 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.149427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.149482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.149504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.149528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.149546 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.252223 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.252275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.252285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.252301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.252312 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.355333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.355370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.355379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.355400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.355409 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.457443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.457497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.457509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.457527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.457539 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.561298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.561656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.561804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.561914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.562010 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.665658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.665701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.665709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.665724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.665734 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.768150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.768496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.768595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.768702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.768810 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.872161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.872220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.872233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.872254 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.872270 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.897436 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:06 crc kubenswrapper[4772]: E0930 17:03:06.897690 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
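pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464"

The util.go:30 entries are the other face of the same stall: pods such as network-metrics-daemon-wlgc4 have no sandbox yet, and each sync attempt is skipped with the NetworkPluginNotReady error, so the same pods recur every second or two. A minimal sketch for tallying which pods are stuck in this retry loop, assuming the journal has been saved to a file first (a hypothetical capture such as `journalctl -u kubelet > kubelet.log` on the node):

```python
# Sketch: count "No sandbox for pod can be found" retries per pod in a
# saved kubelet journal ("kubelet.log" is an assumed capture, see above).
import re
from collections import Counter

PATTERN = re.compile(r'No sandbox for pod can be found.*?pod="([^"]+)"', re.S)

with open("kubelet.log") as f:
    counts = Counter(PATTERN.findall(f.read()))

for pod, n in counts.most_common():
    print(f"{n:4d}  {pod}")
```
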
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.918564 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.975249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.975308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.975325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.975342 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:06 crc kubenswrapper[4772]: I0930 17:03:06.975356 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:06Z","lastTransitionTime":"2025-09-30T17:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.078932 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.079002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.079016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.079041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.079069 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.182355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.182408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.182420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.182439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.182452 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.286300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.286371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.286385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.286411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.286428 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.391889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.391966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.391979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.392007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.392023 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.495541 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.495609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.495629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.495662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.495706 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.598919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.598983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.599024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.599046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.599092 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.701442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.701478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.701489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.701507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.701521 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.796310 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.797635 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:03:07 crc kubenswrapper[4772]: E0930 17:03:07.797929 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.813391 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.813459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.813479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.813507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.813526 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.829382 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j5z7n" podStartSLOduration=67.82934632 podStartE2EDuration="1m7.82934632s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:07.827914402 +0000 UTC m=+88.734927233" watchObservedRunningTime="2025-09-30 17:03:07.82934632 +0000 UTC m=+88.736359181" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.889864 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-47rqk" podStartSLOduration=67.889846126 podStartE2EDuration="1m7.889846126s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:07.889287991 +0000 UTC m=+88.796300812" watchObservedRunningTime="2025-09-30 17:03:07.889846126 +0000 UTC m=+88.796858947" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.897656 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.897656 4772 util.go:30] "No sandbox for pod can be found. 
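Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"

Two things change character here: ovnkube-controller is being held in CrashLoopBackOff ("back-off 40s"), which is consistent with the CNI config never appearing, while pod_startup_latency_tracker.go:104 starts emitting one record per pod that did come up. In those records podStartE2EDuration appears to be podStartSLOduration rendered as a Go duration string (67.82934632 vs "1m7.82934632s"). A minimal sketch for pulling the records out of the same hypothetical kubelet.log capture:

```python
# Sketch: extract per-pod startup latency records from a saved journal
# ("kubelet.log" is an assumed capture, as above).
import re

REC = re.compile(
    r'Observed pod startup duration.*?pod="([^"]+)"'
    r'.*?podStartE2EDuration="([^"]+)"', re.S)

with open("kubelet.log") as f:
    for pod, e2e in REC.findall(f.read()):
        print(f"{e2e:>14}  {pod}")
```
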
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.897754 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:07 crc kubenswrapper[4772]: E0930 17:03:07.898021 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:07 crc kubenswrapper[4772]: E0930 17:03:07.897884 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:07 crc kubenswrapper[4772]: E0930 17:03:07.898094 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.902774 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podStartSLOduration=67.902761582 podStartE2EDuration="1m7.902761582s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:07.902629019 +0000 UTC m=+88.809641860" watchObservedRunningTime="2025-09-30 17:03:07.902761582 +0000 UTC m=+88.809774413" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.916650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.916687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.916698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.916711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.916722 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.983962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.984009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.984019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.984037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:03:07 crc kubenswrapper[4772]: I0930 17:03:07.984048 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:03:07Z","lastTransitionTime":"2025-09-30T17:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.013818 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jm5rd" podStartSLOduration=67.013792895 podStartE2EDuration="1m7.013792895s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:07.997699436 +0000 UTC m=+88.904712277" watchObservedRunningTime="2025-09-30 17:03:08.013792895 +0000 UTC m=+88.920805726" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.014174 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=34.014169815 podStartE2EDuration="34.014169815s" podCreationTimestamp="2025-09-30 17:02:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.013741784 +0000 UTC m=+88.920754635" watchObservedRunningTime="2025-09-30 17:03:08.014169815 +0000 UTC m=+88.921182646" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.031111 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k2jvh" podStartSLOduration=68.031076175 podStartE2EDuration="1m8.031076175s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.029896074 +0000 UTC m=+88.936908905" watchObservedRunningTime="2025-09-30 17:03:08.031076175 +0000 UTC m=+88.938089006" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.037875 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn"] Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.038484 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.040704 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.040750 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.040866 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.041000 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.050391 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7br52" podStartSLOduration=68.050370288 podStartE2EDuration="1m8.050370288s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.049893545 +0000 UTC m=+88.956906386" watchObservedRunningTime="2025-09-30 17:03:08.050370288 +0000 UTC m=+88.957383129" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.063558 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.063536361 podStartE2EDuration="1m8.063536361s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.062919505 +0000 UTC m=+88.969932336" watchObservedRunningTime="2025-09-30 17:03:08.063536361 +0000 UTC m=+88.970549202" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.075824 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdc24c5b-ca59-4685-a195-1e09ca18bb60-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.075906 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc24c5b-ca59-4685-a195-1e09ca18bb60-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.075943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.076025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.076079 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc24c5b-ca59-4685-a195-1e09ca18bb60-service-ca\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.098523 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.098504202 podStartE2EDuration="14.098504202s" podCreationTimestamp="2025-09-30 17:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.098135652 +0000 UTC m=+89.005148493" watchObservedRunningTime="2025-09-30 17:03:08.098504202 +0000 UTC m=+89.005517033" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.142229 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.14219187 podStartE2EDuration="1m8.14219187s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.142004455 +0000 UTC m=+89.049017296" watchObservedRunningTime="2025-09-30 17:03:08.14219187 +0000 UTC m=+89.049204711" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.143735 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.14372233 podStartE2EDuration="2.14372233s" podCreationTimestamp="2025-09-30 17:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:08.122115397 +0000 UTC m=+89.029128228" watchObservedRunningTime="2025-09-30 17:03:08.14372233 +0000 UTC m=+89.050735171" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177239 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdc24c5b-ca59-4685-a195-1e09ca18bb60-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc24c5b-ca59-4685-a195-1e09ca18bb60-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177337 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177392 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177412 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc24c5b-ca59-4685-a195-1e09ca18bb60-service-ca\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177566 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.177597 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fdc24c5b-ca59-4685-a195-1e09ca18bb60-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.178365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc24c5b-ca59-4685-a195-1e09ca18bb60-service-ca\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.184652 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc24c5b-ca59-4685-a195-1e09ca18bb60-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.195609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdc24c5b-ca59-4685-a195-1e09ca18bb60-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-62xmn\" (UID: \"fdc24c5b-ca59-4685-a195-1e09ca18bb60\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.353381 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" Sep 30 17:03:08 crc kubenswrapper[4772]: W0930 17:03:08.370194 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc24c5b_ca59_4685_a195_1e09ca18bb60.slice/crio-934357b403f578a21a2fb3df5387f504aad81f5aa8f30e22c2edda53b7794309 WatchSource:0}: Error finding container 934357b403f578a21a2fb3df5387f504aad81f5aa8f30e22c2edda53b7794309: Status 404 returned error can't find the container with id 934357b403f578a21a2fb3df5387f504aad81f5aa8f30e22c2edda53b7794309 Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.458961 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" event={"ID":"fdc24c5b-ca59-4685-a195-1e09ca18bb60","Type":"ContainerStarted","Data":"934357b403f578a21a2fb3df5387f504aad81f5aa8f30e22c2edda53b7794309"} Sep 30 17:03:08 crc kubenswrapper[4772]: I0930 17:03:08.897758 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:08 crc kubenswrapper[4772]: E0930 17:03:08.897899 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:09 crc kubenswrapper[4772]: I0930 17:03:09.465286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" event={"ID":"fdc24c5b-ca59-4685-a195-1e09ca18bb60","Type":"ContainerStarted","Data":"e4f90c2777a25c9a91b51640615565fe3b87d54507bb07f0bd781e54f57f6b94"} Sep 30 17:03:09 crc kubenswrapper[4772]: I0930 17:03:09.897997 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:09 crc kubenswrapper[4772]: I0930 17:03:09.898053 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:09 crc kubenswrapper[4772]: I0930 17:03:09.898015 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:09 crc kubenswrapper[4772]: E0930 17:03:09.899287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:09 crc kubenswrapper[4772]: E0930 17:03:09.899434 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:09 crc kubenswrapper[4772]: E0930 17:03:09.899747 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:10 crc kubenswrapper[4772]: I0930 17:03:10.897457 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:10 crc kubenswrapper[4772]: E0930 17:03:10.897614 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:11 crc kubenswrapper[4772]: I0930 17:03:11.897428 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:11 crc kubenswrapper[4772]: I0930 17:03:11.897449 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:11 crc kubenswrapper[4772]: I0930 17:03:11.897679 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:11 crc kubenswrapper[4772]: E0930 17:03:11.897846 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:11 crc kubenswrapper[4772]: E0930 17:03:11.897917 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:11 crc kubenswrapper[4772]: E0930 17:03:11.898011 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:12 crc kubenswrapper[4772]: I0930 17:03:12.898243 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:12 crc kubenswrapper[4772]: E0930 17:03:12.898418 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:13 crc kubenswrapper[4772]: I0930 17:03:13.898164 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:13 crc kubenswrapper[4772]: I0930 17:03:13.898256 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:13 crc kubenswrapper[4772]: E0930 17:03:13.898479 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:13 crc kubenswrapper[4772]: I0930 17:03:13.898551 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:13 crc kubenswrapper[4772]: E0930 17:03:13.898701 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:13 crc kubenswrapper[4772]: E0930 17:03:13.899102 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:14 crc kubenswrapper[4772]: I0930 17:03:14.897194 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:14 crc kubenswrapper[4772]: E0930 17:03:14.897830 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:15 crc kubenswrapper[4772]: I0930 17:03:15.897526 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:15 crc kubenswrapper[4772]: I0930 17:03:15.897588 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:15 crc kubenswrapper[4772]: E0930 17:03:15.897695 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:15 crc kubenswrapper[4772]: I0930 17:03:15.897770 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:15 crc kubenswrapper[4772]: E0930 17:03:15.897913 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:15 crc kubenswrapper[4772]: E0930 17:03:15.898011 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:16 crc kubenswrapper[4772]: I0930 17:03:16.898043 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:16 crc kubenswrapper[4772]: E0930 17:03:16.898972 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:17 crc kubenswrapper[4772]: I0930 17:03:17.898009 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:17 crc kubenswrapper[4772]: I0930 17:03:17.898018 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:17 crc kubenswrapper[4772]: E0930 17:03:17.898375 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:17 crc kubenswrapper[4772]: E0930 17:03:17.898589 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:17 crc kubenswrapper[4772]: I0930 17:03:17.898758 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:17 crc kubenswrapper[4772]: E0930 17:03:17.899430 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:18 crc kubenswrapper[4772]: I0930 17:03:18.897356 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:18 crc kubenswrapper[4772]: E0930 17:03:18.897910 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:19 crc kubenswrapper[4772]: I0930 17:03:19.306110 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:19 crc kubenswrapper[4772]: E0930 17:03:19.306326 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:03:19 crc kubenswrapper[4772]: E0930 17:03:19.306394 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs podName:0f2541dd-c77d-4bc5-9771-6ac741731464 nodeName:}" failed. No retries permitted until 2025-09-30 17:04:23.306375779 +0000 UTC m=+164.213388610 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs") pod "network-metrics-daemon-wlgc4" (UID: "0f2541dd-c77d-4bc5-9771-6ac741731464") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:03:19 crc kubenswrapper[4772]: I0930 17:03:19.897687 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:19 crc kubenswrapper[4772]: I0930 17:03:19.897747 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:19 crc kubenswrapper[4772]: I0930 17:03:19.897698 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:19 crc kubenswrapper[4772]: E0930 17:03:19.899094 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:19 crc kubenswrapper[4772]: E0930 17:03:19.899226 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:19 crc kubenswrapper[4772]: E0930 17:03:19.899455 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:20 crc kubenswrapper[4772]: I0930 17:03:20.897474 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:20 crc kubenswrapper[4772]: E0930 17:03:20.897682 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:21 crc kubenswrapper[4772]: I0930 17:03:21.898227 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:21 crc kubenswrapper[4772]: I0930 17:03:21.898288 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:21 crc kubenswrapper[4772]: I0930 17:03:21.899487 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:21 crc kubenswrapper[4772]: E0930 17:03:21.899711 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:21 crc kubenswrapper[4772]: E0930 17:03:21.899863 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:21 crc kubenswrapper[4772]: E0930 17:03:21.899945 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:22 crc kubenswrapper[4772]: I0930 17:03:22.898213 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:22 crc kubenswrapper[4772]: E0930 17:03:22.898407 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:22 crc kubenswrapper[4772]: I0930 17:03:22.899300 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:03:22 crc kubenswrapper[4772]: E0930 17:03:22.899477 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:03:23 crc kubenswrapper[4772]: I0930 17:03:23.897993 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:23 crc kubenswrapper[4772]: I0930 17:03:23.898192 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:23 crc kubenswrapper[4772]: E0930 17:03:23.898306 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:23 crc kubenswrapper[4772]: I0930 17:03:23.898454 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:23 crc kubenswrapper[4772]: E0930 17:03:23.898627 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:23 crc kubenswrapper[4772]: E0930 17:03:23.898765 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:24 crc kubenswrapper[4772]: I0930 17:03:24.898286 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:24 crc kubenswrapper[4772]: E0930 17:03:24.898520 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:25 crc kubenswrapper[4772]: I0930 17:03:25.897843 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:25 crc kubenswrapper[4772]: I0930 17:03:25.898021 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:25 crc kubenswrapper[4772]: E0930 17:03:25.898160 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:25 crc kubenswrapper[4772]: I0930 17:03:25.898209 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:25 crc kubenswrapper[4772]: E0930 17:03:25.898327 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:25 crc kubenswrapper[4772]: E0930 17:03:25.898497 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:26 crc kubenswrapper[4772]: I0930 17:03:26.897455 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:26 crc kubenswrapper[4772]: E0930 17:03:26.897632 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:27 crc kubenswrapper[4772]: I0930 17:03:27.898146 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:27 crc kubenswrapper[4772]: E0930 17:03:27.898422 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:27 crc kubenswrapper[4772]: I0930 17:03:27.898282 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:27 crc kubenswrapper[4772]: E0930 17:03:27.898515 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:27 crc kubenswrapper[4772]: I0930 17:03:27.898276 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:27 crc kubenswrapper[4772]: E0930 17:03:27.898571 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:28 crc kubenswrapper[4772]: I0930 17:03:28.897222 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:28 crc kubenswrapper[4772]: E0930 17:03:28.897511 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:29 crc kubenswrapper[4772]: I0930 17:03:29.897757 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:29 crc kubenswrapper[4772]: I0930 17:03:29.897823 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:29 crc kubenswrapper[4772]: I0930 17:03:29.897877 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:29 crc kubenswrapper[4772]: E0930 17:03:29.899335 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:29 crc kubenswrapper[4772]: E0930 17:03:29.899477 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:29 crc kubenswrapper[4772]: E0930 17:03:29.899592 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:30 crc kubenswrapper[4772]: I0930 17:03:30.897879 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:30 crc kubenswrapper[4772]: E0930 17:03:30.898138 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:31 crc kubenswrapper[4772]: I0930 17:03:31.897641 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:31 crc kubenswrapper[4772]: I0930 17:03:31.897803 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:31 crc kubenswrapper[4772]: I0930 17:03:31.897841 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:31 crc kubenswrapper[4772]: E0930 17:03:31.897985 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:31 crc kubenswrapper[4772]: E0930 17:03:31.898257 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:31 crc kubenswrapper[4772]: E0930 17:03:31.898455 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:32 crc kubenswrapper[4772]: I0930 17:03:32.898126 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:32 crc kubenswrapper[4772]: E0930 17:03:32.898273 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:33 crc kubenswrapper[4772]: I0930 17:03:33.897916 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:33 crc kubenswrapper[4772]: I0930 17:03:33.898080 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:33 crc kubenswrapper[4772]: E0930 17:03:33.898133 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:33 crc kubenswrapper[4772]: I0930 17:03:33.898191 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:33 crc kubenswrapper[4772]: E0930 17:03:33.898330 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:33 crc kubenswrapper[4772]: E0930 17:03:33.898350 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:34 crc kubenswrapper[4772]: I0930 17:03:34.898046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:34 crc kubenswrapper[4772]: E0930 17:03:34.898235 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.554487 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/1.log" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.555125 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/0.log" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.555193 4772 generic.go:334] "Generic (PLEG): container finished" podID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" containerID="8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670" exitCode=1 Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.555229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerDied","Data":"8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670"} Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.555268 4772 scope.go:117] "RemoveContainer" containerID="6ba2f60e9aef7b803cf7f8b10036c4606967f40ec8b24a9fb85687cf8a5fd2f1" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.555719 4772 scope.go:117] "RemoveContainer" containerID="8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670" Sep 30 17:03:35 crc kubenswrapper[4772]: E0930 17:03:35.555925 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7br52_openshift-multus(5e5b90d4-3f5e-49d8-b2c5-175948eeeda6)\"" pod="openshift-multus/multus-7br52" podUID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.578678 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-62xmn" podStartSLOduration=95.578659856 podStartE2EDuration="1m35.578659856s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:09.4849329 +0000 UTC m=+90.391945731" watchObservedRunningTime="2025-09-30 17:03:35.578659856 +0000 UTC m=+116.485672707" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.897442 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.897513 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:35 crc kubenswrapper[4772]: E0930 17:03:35.897665 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:35 crc kubenswrapper[4772]: I0930 17:03:35.897704 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:35 crc kubenswrapper[4772]: E0930 17:03:35.897855 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:35 crc kubenswrapper[4772]: E0930 17:03:35.897960 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:36 crc kubenswrapper[4772]: I0930 17:03:36.559886 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/1.log" Sep 30 17:03:36 crc kubenswrapper[4772]: I0930 17:03:36.897997 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:36 crc kubenswrapper[4772]: E0930 17:03:36.898221 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:37 crc kubenswrapper[4772]: I0930 17:03:37.898037 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:37 crc kubenswrapper[4772]: I0930 17:03:37.898099 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:37 crc kubenswrapper[4772]: I0930 17:03:37.898190 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:37 crc kubenswrapper[4772]: E0930 17:03:37.898235 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:37 crc kubenswrapper[4772]: E0930 17:03:37.898424 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:37 crc kubenswrapper[4772]: E0930 17:03:37.898487 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:37 crc kubenswrapper[4772]: I0930 17:03:37.899349 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:03:37 crc kubenswrapper[4772]: E0930 17:03:37.899597 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bj99l_openshift-ovn-kubernetes(47daa5db-853e-45af-98ae-489980c97641)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" Sep 30 17:03:38 crc kubenswrapper[4772]: I0930 17:03:38.897534 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:38 crc kubenswrapper[4772]: E0930 17:03:38.897797 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:39 crc kubenswrapper[4772]: E0930 17:03:39.833365 4772 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 17:03:39 crc kubenswrapper[4772]: I0930 17:03:39.897747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:39 crc kubenswrapper[4772]: I0930 17:03:39.897893 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:39 crc kubenswrapper[4772]: I0930 17:03:39.897907 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:39 crc kubenswrapper[4772]: E0930 17:03:39.899198 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:39 crc kubenswrapper[4772]: E0930 17:03:39.899628 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:39 crc kubenswrapper[4772]: E0930 17:03:39.899834 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:40 crc kubenswrapper[4772]: E0930 17:03:40.026134 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:03:40 crc kubenswrapper[4772]: I0930 17:03:40.897563 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:40 crc kubenswrapper[4772]: E0930 17:03:40.897794 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:41 crc kubenswrapper[4772]: I0930 17:03:41.897549 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:41 crc kubenswrapper[4772]: I0930 17:03:41.897567 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:41 crc kubenswrapper[4772]: E0930 17:03:41.897744 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:41 crc kubenswrapper[4772]: I0930 17:03:41.897809 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:41 crc kubenswrapper[4772]: E0930 17:03:41.897958 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:41 crc kubenswrapper[4772]: E0930 17:03:41.898000 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:42 crc kubenswrapper[4772]: I0930 17:03:42.897520 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:42 crc kubenswrapper[4772]: E0930 17:03:42.897714 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:43 crc kubenswrapper[4772]: I0930 17:03:43.897640 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:43 crc kubenswrapper[4772]: I0930 17:03:43.897686 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:43 crc kubenswrapper[4772]: E0930 17:03:43.898282 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:43 crc kubenswrapper[4772]: I0930 17:03:43.897780 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:43 crc kubenswrapper[4772]: E0930 17:03:43.898451 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:43 crc kubenswrapper[4772]: E0930 17:03:43.898518 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:44 crc kubenswrapper[4772]: I0930 17:03:44.897615 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:44 crc kubenswrapper[4772]: E0930 17:03:44.897847 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:45 crc kubenswrapper[4772]: E0930 17:03:45.027745 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:03:45 crc kubenswrapper[4772]: I0930 17:03:45.898309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:45 crc kubenswrapper[4772]: E0930 17:03:45.898571 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:45 crc kubenswrapper[4772]: I0930 17:03:45.898909 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:45 crc kubenswrapper[4772]: E0930 17:03:45.899045 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:45 crc kubenswrapper[4772]: I0930 17:03:45.899338 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:45 crc kubenswrapper[4772]: E0930 17:03:45.899469 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:46 crc kubenswrapper[4772]: I0930 17:03:46.898210 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:46 crc kubenswrapper[4772]: E0930 17:03:46.898538 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:47 crc kubenswrapper[4772]: I0930 17:03:47.897200 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:47 crc kubenswrapper[4772]: E0930 17:03:47.897413 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:47 crc kubenswrapper[4772]: I0930 17:03:47.897491 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:47 crc kubenswrapper[4772]: I0930 17:03:47.897515 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:47 crc kubenswrapper[4772]: E0930 17:03:47.897627 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:47 crc kubenswrapper[4772]: E0930 17:03:47.897793 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:48 crc kubenswrapper[4772]: I0930 17:03:48.898229 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:48 crc kubenswrapper[4772]: E0930 17:03:48.898429 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:49 crc kubenswrapper[4772]: I0930 17:03:49.897898 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:49 crc kubenswrapper[4772]: I0930 17:03:49.898091 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:49 crc kubenswrapper[4772]: I0930 17:03:49.898919 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:49 crc kubenswrapper[4772]: E0930 17:03:49.899310 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:49 crc kubenswrapper[4772]: E0930 17:03:49.899492 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:49 crc kubenswrapper[4772]: I0930 17:03:49.899529 4772 scope.go:117] "RemoveContainer" containerID="8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670" Sep 30 17:03:49 crc kubenswrapper[4772]: E0930 17:03:49.899775 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:50 crc kubenswrapper[4772]: E0930 17:03:50.028993 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:03:50 crc kubenswrapper[4772]: I0930 17:03:50.625191 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/1.log" Sep 30 17:03:50 crc kubenswrapper[4772]: I0930 17:03:50.625293 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerStarted","Data":"dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83"} Sep 30 17:03:50 crc kubenswrapper[4772]: I0930 17:03:50.897970 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:50 crc kubenswrapper[4772]: E0930 17:03:50.898167 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:51 crc kubenswrapper[4772]: I0930 17:03:51.897692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:51 crc kubenswrapper[4772]: I0930 17:03:51.897703 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:51 crc kubenswrapper[4772]: I0930 17:03:51.897740 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:51 crc kubenswrapper[4772]: E0930 17:03:51.898115 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:51 crc kubenswrapper[4772]: E0930 17:03:51.898301 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:51 crc kubenswrapper[4772]: E0930 17:03:51.898441 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:51 crc kubenswrapper[4772]: I0930 17:03:51.898543 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.634754 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/3.log" Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.637919 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerStarted","Data":"8349cf0ac4454fe23d9f83ac717bce1f5de2645c6ceda50c1052f259339b3be3"} Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.638540 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.669411 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podStartSLOduration=112.669384966 podStartE2EDuration="1m52.669384966s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:03:52.665524705 +0000 UTC m=+133.572537576" watchObservedRunningTime="2025-09-30 17:03:52.669384966 +0000 UTC m=+133.576397817" Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.797903 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlgc4"] Sep 30 17:03:52 crc kubenswrapper[4772]: I0930 17:03:52.798049 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:52 crc kubenswrapper[4772]: E0930 17:03:52.798165 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:53 crc kubenswrapper[4772]: I0930 17:03:53.897604 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:53 crc kubenswrapper[4772]: I0930 17:03:53.897712 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:53 crc kubenswrapper[4772]: I0930 17:03:53.897712 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:53 crc kubenswrapper[4772]: I0930 17:03:53.897597 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:53 crc kubenswrapper[4772]: E0930 17:03:53.897851 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:53 crc kubenswrapper[4772]: E0930 17:03:53.898042 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:53 crc kubenswrapper[4772]: E0930 17:03:53.898276 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:53 crc kubenswrapper[4772]: E0930 17:03:53.898380 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:55 crc kubenswrapper[4772]: E0930 17:03:55.030569 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Sep 30 17:03:55 crc kubenswrapper[4772]: I0930 17:03:55.897435 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:55 crc kubenswrapper[4772]: I0930 17:03:55.897551 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:55 crc kubenswrapper[4772]: E0930 17:03:55.897659 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:55 crc kubenswrapper[4772]: I0930 17:03:55.897700 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:55 crc kubenswrapper[4772]: I0930 17:03:55.897699 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:55 crc kubenswrapper[4772]: E0930 17:03:55.897900 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:55 crc kubenswrapper[4772]: E0930 17:03:55.898095 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:55 crc kubenswrapper[4772]: E0930 17:03:55.898339 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:57 crc kubenswrapper[4772]: I0930 17:03:57.897922 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:57 crc kubenswrapper[4772]: I0930 17:03:57.898035 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:57 crc kubenswrapper[4772]: I0930 17:03:57.899349 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:57 crc kubenswrapper[4772]: E0930 17:03:57.899575 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:03:57 crc kubenswrapper[4772]: E0930 17:03:57.899909 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:57 crc kubenswrapper[4772]: E0930 17:03:57.899985 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:57 crc kubenswrapper[4772]: I0930 17:03:57.900553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:57 crc kubenswrapper[4772]: E0930 17:03:57.901771 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:59 crc kubenswrapper[4772]: I0930 17:03:59.897949 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:03:59 crc kubenswrapper[4772]: I0930 17:03:59.898131 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:03:59 crc kubenswrapper[4772]: I0930 17:03:59.898130 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:03:59 crc kubenswrapper[4772]: I0930 17:03:59.900138 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:03:59 crc kubenswrapper[4772]: E0930 17:03:59.900116 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:03:59 crc kubenswrapper[4772]: E0930 17:03:59.900321 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:03:59 crc kubenswrapper[4772]: E0930 17:03:59.900378 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:03:59 crc kubenswrapper[4772]: E0930 17:03:59.900568 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlgc4" podUID="0f2541dd-c77d-4bc5-9771-6ac741731464" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.897457 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.897552 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.897588 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.897703 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.902663 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.902821 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.902870 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.902939 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.902951 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 17:04:01 crc kubenswrapper[4772]: I0930 17:04:01.903333 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.817474 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.891766 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.891925 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.892026 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.892138 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.892180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 
17:04:07 crc kubenswrapper[4772]: E0930 17:04:07.893006 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:06:09.892980474 +0000 UTC m=+270.799993335 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.895316 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.899749 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.900330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.902465 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.927276 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.944391 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:07 crc kubenswrapper[4772]: I0930 17:04:07.970885 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:04:08 crc kubenswrapper[4772]: W0930 17:04:08.200175 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-e9a9d487dd1e23e70397d4a39bc7dd6c61897d313525f1170cd3539642ed7957 WatchSource:0}: Error finding container e9a9d487dd1e23e70397d4a39bc7dd6c61897d313525f1170cd3539642ed7957: Status 404 returned error can't find the container with id e9a9d487dd1e23e70397d4a39bc7dd6c61897d313525f1170cd3539642ed7957 Sep 30 17:04:08 crc kubenswrapper[4772]: W0930 17:04:08.366741 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-602d852cd0d7230fad4de2d382acfa27541b063c70952f656a25fc6f8b79fe39 WatchSource:0}: Error finding container 602d852cd0d7230fad4de2d382acfa27541b063c70952f656a25fc6f8b79fe39: Status 404 returned error can't find the container with id 602d852cd0d7230fad4de2d382acfa27541b063c70952f656a25fc6f8b79fe39 Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.656371 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.656500 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.704618 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"821c689956fb37f7912c22537673e777ed25c778b8c32a81e637257a0f1a1d57"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.704666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9da1230c69061c2257c5bc6774d4f0410f59fe8c6dee1bb5c3250b228706fccc"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.707321 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fa80035350601dd267b10c36772fc310d6f70937e96f7904642a8d51f3361561"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.707366 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e9a9d487dd1e23e70397d4a39bc7dd6c61897d313525f1170cd3539642ed7957"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.710218 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c77ec7fb1e20130779723a095bb0354d1328ad0b9a3549625206b79abd273e5a"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.710241 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"602d852cd0d7230fad4de2d382acfa27541b063c70952f656a25fc6f8b79fe39"} Sep 30 17:04:08 crc kubenswrapper[4772]: I0930 17:04:08.710372 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.230236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.273905 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.274493 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.285251 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288201 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288626 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288669 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288757 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288863 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.288918 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.289392 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.289443 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.292917 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.294455 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jnm2b"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.295624 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klzl8"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.296263 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.296736 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.297320 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.297824 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.298129 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlsdw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.298549 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.298886 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.298885 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.300110 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8rwzz"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.300642 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.302074 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.303107 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ppvk7"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.309380 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.315577 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.316354 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.324659 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.325675 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.327234 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.328204 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.329113 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.339188 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.340381 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.340535 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.340809 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.340962 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.342730 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.342909 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.343538 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.344083 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.344450 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.345477 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.347300 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.347814 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.348826 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.348893 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349284 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349315 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349418 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349435 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349538 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349672 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349755 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349792 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349862 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349973 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.349980 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350141 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350210 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350422 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 17:04:09 
crc kubenswrapper[4772]: I0930 17:04:09.350480 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350570 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350656 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350664 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350733 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350777 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350807 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.350937 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351164 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351302 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351540 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351675 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351758 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351822 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.351949 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.367213 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.368031 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.369366 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.369696 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.369790 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.369903 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370003 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370283 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370444 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370586 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370707 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370892 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.370906 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.371080 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.371212 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.371238 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.371347 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.371431 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.372498 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.372574 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.373232 
4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.373396 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.373724 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.373829 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.374148 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.375044 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.375712 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.376326 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.376666 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-n4pdp"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.377298 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.377776 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z2t2t"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.378460 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.383005 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.383243 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.383400 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.383489 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.383610 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.384222 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.384418 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.384583 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.384623 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.384904 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.408046 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.410870 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.410911 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxplw\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-kube-api-access-sxplw\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.410942 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 
17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.410966 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-image-import-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.410992 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/129e39d2-1f26-4919-b1d3-70597defd1c8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411022 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411041 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-auth-proxy-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411080 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411107 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411150 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29905da-24ff-4cf7-93f2-4b20a1a9b934-config\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc 
kubenswrapper[4772]: I0930 17:04:09.411172 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411193 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fafebd4f-5889-4b08-9e9f-0192504348c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411221 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q78b\" (UniqueName: \"kubernetes.io/projected/211ab76e-6958-4f86-9549-06542e81a3e7-kube-api-access-5q78b\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411241 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411282 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjxp\" (UniqueName: \"kubernetes.io/projected/056fc2a2-f5db-4887-bada-a7215edd00d4-kube-api-access-fjjxp\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411330 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f033de-0c77-4c9e-bd73-873fe5ecce6c-serving-cert\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411348 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411340 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411395 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9x8\" (UniqueName: \"kubernetes.io/projected/129e39d2-1f26-4919-b1d3-70597defd1c8-kube-api-access-9h9x8\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-audit\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411454 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49321d19-b839-494f-a4f2-5505fb7ad9ab-serving-cert\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9vl\" (UniqueName: \"kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411499 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411525 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqnm\" (UniqueName: \"kubernetes.io/projected/fafebd4f-5889-4b08-9e9f-0192504348c9-kube-api-access-bsqnm\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411574 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411594 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f28a4e98-7805-44af-9ba1-4143a95625c5-machine-approver-tls\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411613 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4z5w\" (UniqueName: \"kubernetes.io/projected/f28a4e98-7805-44af-9ba1-4143a95625c5-kube-api-access-j4z5w\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411658 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxxx\" (UniqueName: \"kubernetes.io/projected/2bda8593-604a-4bf9-9cd1-0d56310dd0f0-kube-api-access-llxxx\") pod \"downloads-7954f5f757-ppvk7\" (UID: \"2bda8593-604a-4bf9-9cd1-0d56310dd0f0\") " pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-audit-dir\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411699 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgwx\" (UniqueName: \"kubernetes.io/projected/b023c669-cb19-4010-b9d7-120bdfff87bd-kube-api-access-tzgwx\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411718 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7rq4\" (UniqueName: \"kubernetes.io/projected/49321d19-b839-494f-a4f2-5505fb7ad9ab-kube-api-access-j7rq4\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411739 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411760 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b023c669-cb19-4010-b9d7-120bdfff87bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411794 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411816 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411836 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65hm\" (UniqueName: \"kubernetes.io/projected/44f033de-0c77-4c9e-bd73-873fe5ecce6c-kube-api-access-p65hm\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411878 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config\") pod 
\"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411906 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411966 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.411992 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5tt\" (UniqueName: \"kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-config\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412034 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fafebd4f-5889-4b08-9e9f-0192504348c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412052 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412097 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-config\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412141 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-encryption-config\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412157 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412180 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412207 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-images\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412228 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-node-pullsecrets\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412268 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412286 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-audit-policies\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412305 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ab76e-6958-4f86-9549-06542e81a3e7-audit-dir\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvm5\" (UniqueName: \"kubernetes.io/projected/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-kube-api-access-msvm5\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412356 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-client\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412375 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-serving-cert\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d29905da-24ff-4cf7-93f2-4b20a1a9b934-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412420 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-config\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412442 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412461 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-serving-cert\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412508 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-encryption-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412528 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412566 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-service-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412587 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8df\" (UniqueName: \"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412606 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-client\") pod 
\"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412629 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29905da-24ff-4cf7-93f2-4b20a1a9b934-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412647 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-trusted-ca\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.412697 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.419914 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.420393 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.420777 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.420916 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.421175 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.421255 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.421177 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.421435 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.421544 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.422564 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.422974 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.423935 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 
17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.424139 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.426097 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.434873 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.435454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.440400 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.442003 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.442896 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.443607 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.445567 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.446750 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.446937 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.447209 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.447321 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.448566 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dt94w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.448814 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.448988 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.449191 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.450464 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.450948 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.451663 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.451682 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.452149 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.452263 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.452152 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.454666 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.457708 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.460654 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.461353 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.461724 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sbg9r"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.462537 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.462640 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.462886 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.463507 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.463636 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.464099 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.466620 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.467280 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.467632 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.468129 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.472024 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.476681 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.477818 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-grlbs"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.478372 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.479026 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.479193 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.483116 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.484241 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.485173 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.486257 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlsdw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.486379 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.487516 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.490533 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.493016 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klzl8"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.499199 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.499289 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8rwzz"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.502176 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.504513 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.505252 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tqlhz"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.510799 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.510724 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.511862 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513588 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513613 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513644 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513673 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29905da-24ff-4cf7-93f2-4b20a1a9b934-config\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513731 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fafebd4f-5889-4b08-9e9f-0192504348c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513755 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5q78b\" (UniqueName: \"kubernetes.io/projected/211ab76e-6958-4f86-9549-06542e81a3e7-kube-api-access-5q78b\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513832 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjxp\" (UniqueName: \"kubernetes.io/projected/056fc2a2-f5db-4887-bada-a7215edd00d4-kube-api-access-fjjxp\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513859 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513892 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f033de-0c77-4c9e-bd73-873fe5ecce6c-serving-cert\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.513923 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.514175 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9x8\" (UniqueName: \"kubernetes.io/projected/129e39d2-1f26-4919-b1d3-70597defd1c8-kube-api-access-9h9x8\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.514215 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-audit\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.514249 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49321d19-b839-494f-a4f2-5505fb7ad9ab-serving-cert\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.514272 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9vl\" (UniqueName: \"kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.515785 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.515972 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.516560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29905da-24ff-4cf7-93f2-4b20a1a9b934-config\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.518989 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.519224 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.519219 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.520341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqnm\" (UniqueName: \"kubernetes.io/projected/fafebd4f-5889-4b08-9e9f-0192504348c9-kube-api-access-bsqnm\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.520408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.520656 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-audit\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.521827 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.521902 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f28a4e98-7805-44af-9ba1-4143a95625c5-machine-approver-tls\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.521961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4z5w\" (UniqueName: \"kubernetes.io/projected/f28a4e98-7805-44af-9ba1-4143a95625c5-kube-api-access-j4z5w\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.521995 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522068 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxxx\" (UniqueName: \"kubernetes.io/projected/2bda8593-604a-4bf9-9cd1-0d56310dd0f0-kube-api-access-llxxx\") pod \"downloads-7954f5f757-ppvk7\" (UID: \"2bda8593-604a-4bf9-9cd1-0d56310dd0f0\") " pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522097 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgwx\" (UniqueName: \"kubernetes.io/projected/b023c669-cb19-4010-b9d7-120bdfff87bd-kube-api-access-tzgwx\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522131 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-audit-dir\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522141 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " 
pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522165 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7rq4\" (UniqueName: \"kubernetes.io/projected/49321d19-b839-494f-a4f2-5505fb7ad9ab-kube-api-access-j7rq4\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522218 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522251 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b023c669-cb19-4010-b9d7-120bdfff87bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522281 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522350 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p65hm\" (UniqueName: \"kubernetes.io/projected/44f033de-0c77-4c9e-bd73-873fe5ecce6c-kube-api-access-p65hm\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522377 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc 
kubenswrapper[4772]: I0930 17:04:09.522433 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522462 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522486 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522570 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522600 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5tt\" (UniqueName: \"kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522630 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522655 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-config\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522683 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fafebd4f-5889-4b08-9e9f-0192504348c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522718 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522748 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-config\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522783 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-encryption-config\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522841 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-images\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-node-pullsecrets\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522951 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.522975 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523027 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-audit-policies\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523076 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ab76e-6958-4f86-9549-06542e81a3e7-audit-dir\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvm5\" (UniqueName: \"kubernetes.io/projected/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-kube-api-access-msvm5\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523132 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-client\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523156 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-serving-cert\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523184 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d29905da-24ff-4cf7-93f2-4b20a1a9b934-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523225 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-config\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: 
\"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523251 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523278 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.523958 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524109 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-serving-cert\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524177 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-encryption-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524403 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-service-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524467 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2r8df\" (UniqueName: \"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524483 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524499 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-client\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524652 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29905da-24ff-4cf7-93f2-4b20a1a9b934-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524690 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-trusted-ca\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524747 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524785 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxplw\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-kube-api-access-sxplw\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524837 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-image-import-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524875 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/129e39d2-1f26-4919-b1d3-70597defd1c8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524909 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.524942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-auth-proxy-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.525001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-audit-dir\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.525108 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.525920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-auth-proxy-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.525945 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.526687 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 
17:04:09.526885 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.527210 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.528036 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.528905 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.529667 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28a4e98-7805-44af-9ba1-4143a95625c5-config\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.530440 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.530831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29905da-24ff-4cf7-93f2-4b20a1a9b934-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.530925 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.533547 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.533624 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-trusted-ca\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " 
pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.533740 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-service-ca-bundle\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.533855 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-encryption-config\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.534343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49321d19-b839-494f-a4f2-5505fb7ad9ab-serving-cert\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.535212 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-image-import-ca\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.535609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-images\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.535673 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/056fc2a2-f5db-4887-bada-a7215edd00d4-node-pullsecrets\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.536477 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f28a4e98-7805-44af-9ba1-4143a95625c5-machine-approver-tls\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.536646 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.536759 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.536905 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.536965 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.537449 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-audit-policies\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.537492 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.537543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ab76e-6958-4f86-9549-06542e81a3e7-audit-dir\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.537793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.538449 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49321d19-b839-494f-a4f2-5505fb7ad9ab-config\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.538585 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/129e39d2-1f26-4919-b1d3-70597defd1c8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.538653 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 
17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.539888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b023c669-cb19-4010-b9d7-120bdfff87bd-config\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.540213 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056fc2a2-f5db-4887-bada-a7215edd00d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.540938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jnm2b"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.540966 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.541126 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-serving-cert\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.541481 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fafebd4f-5889-4b08-9e9f-0192504348c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.541882 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.542436 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fafebd4f-5889-4b08-9e9f-0192504348c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.542515 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-etcd-client\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.542572 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056fc2a2-f5db-4887-bada-a7215edd00d4-serving-cert\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.543038 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.543330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f033de-0c77-4c9e-bd73-873fe5ecce6c-config\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.543389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.543842 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.544195 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.544646 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f033de-0c77-4c9e-bd73-873fe5ecce6c-serving-cert\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.544684 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.545004 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-encryption-config\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.545080 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.545223 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.545460 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.547520 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.547655 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.547725 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.548393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b023c669-cb19-4010-b9d7-120bdfff87bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.551732 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.551795 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ppvk7"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.552425 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ab76e-6958-4f86-9549-06542e81a3e7-etcd-client\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 
17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.552503 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8k777"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.554042 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.560389 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.561797 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.565115 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gprnh"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.565939 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.573876 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sbg9r"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.575271 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.576505 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.576800 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.577952 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.579581 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.581011 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.582065 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.583942 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.585848 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.587954 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dt94w"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.589180 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z2t2t"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.590852 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-8k777"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.594368 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-grlbs"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.596312 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gprnh"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.597034 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.597753 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.599005 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.600128 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.602458 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5rtbd"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.603363 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.604328 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5rtbd"] Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.617086 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.637231 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.656600 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.677426 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.696747 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.717542 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.737617 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.761840 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.776881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.797657 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 
17:04:09.816988 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.837301 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.858309 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.876880 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.897068 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.917582 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.936883 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.957261 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.977622 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:04:09 crc kubenswrapper[4772]: I0930 17:04:09.997256 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.027450 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.038230 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.057555 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.077786 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.097250 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.117760 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.136832 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.157249 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.177001 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 
17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.197254 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.216349 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.237367 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.257996 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.277034 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.296507 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.317283 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.336488 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.357245 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.376584 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.396099 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.417164 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.475557 4772 request.go:700] Waited for 1.012700702s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/secrets?fieldSelector=metadata.name%3Ddns-operator-dockercfg-9mqw5&limit=500&resourceVersion=0 Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.477733 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.497260 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.516914 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.537021 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.557131 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.577923 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.597561 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.617009 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.637816 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.657158 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.677744 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.697984 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.718107 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.737346 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.756184 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.777726 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.798627 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.817389 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.836962 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.857414 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.877089 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.896166 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.916499 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.936854 
4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.957238 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.977438 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 17:04:10 crc kubenswrapper[4772]: I0930 17:04:10.997178 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.017532 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.045845 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.056744 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.076708 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.097929 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.117507 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.137392 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.179622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9vl\" (UniqueName: \"kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl\") pod \"console-f9d7485db-tm4sk\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.195920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q78b\" (UniqueName: \"kubernetes.io/projected/211ab76e-6958-4f86-9549-06542e81a3e7-kube-api-access-5q78b\") pod \"apiserver-7bbb656c7d-4x24b\" (UID: \"211ab76e-6958-4f86-9549-06542e81a3e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.225959 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjxp\" (UniqueName: \"kubernetes.io/projected/056fc2a2-f5db-4887-bada-a7215edd00d4-kube-api-access-fjjxp\") pod \"apiserver-76f77b778f-jnm2b\" (UID: \"056fc2a2-f5db-4887-bada-a7215edd00d4\") " pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.235263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: 
\"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.256298 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9x8\" (UniqueName: \"kubernetes.io/projected/129e39d2-1f26-4919-b1d3-70597defd1c8-kube-api-access-9h9x8\") pod \"cluster-samples-operator-665b6dd947-kq5mt\" (UID: \"129e39d2-1f26-4919-b1d3-70597defd1c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.271985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqnm\" (UniqueName: \"kubernetes.io/projected/fafebd4f-5889-4b08-9e9f-0192504348c9-kube-api-access-bsqnm\") pod \"openshift-apiserver-operator-796bbdcf4f-tgc78\" (UID: \"fafebd4f-5889-4b08-9e9f-0192504348c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.279591 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.291008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4z5w\" (UniqueName: \"kubernetes.io/projected/f28a4e98-7805-44af-9ba1-4143a95625c5-kube-api-access-j4z5w\") pod \"machine-approver-56656f9798-crt6b\" (UID: \"f28a4e98-7805-44af-9ba1-4143a95625c5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.315609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65hm\" (UniqueName: \"kubernetes.io/projected/44f033de-0c77-4c9e-bd73-873fe5ecce6c-kube-api-access-p65hm\") pod \"console-operator-58897d9998-wlsdw\" (UID: \"44f033de-0c77-4c9e-bd73-873fe5ecce6c\") " pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.320374 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.332803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxxx\" (UniqueName: \"kubernetes.io/projected/2bda8593-604a-4bf9-9cd1-0d56310dd0f0-kube-api-access-llxxx\") pod \"downloads-7954f5f757-ppvk7\" (UID: \"2bda8593-604a-4bf9-9cd1-0d56310dd0f0\") " pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.369710 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgwx\" (UniqueName: \"kubernetes.io/projected/b023c669-cb19-4010-b9d7-120bdfff87bd-kube-api-access-tzgwx\") pod \"machine-api-operator-5694c8668f-klzl8\" (UID: \"b023c669-cb19-4010-b9d7-120bdfff87bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.379463 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.385752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7rq4\" (UniqueName: \"kubernetes.io/projected/49321d19-b839-494f-a4f2-5505fb7ad9ab-kube-api-access-j7rq4\") pod \"authentication-operator-69f744f599-8rwzz\" (UID: \"49321d19-b839-494f-a4f2-5505fb7ad9ab\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.392731 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.397735 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8df\" (UniqueName: \"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df\") pod \"oauth-openshift-558db77b4-r4gqw\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.415541 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxplw\" (UniqueName: \"kubernetes.io/projected/c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715-kube-api-access-sxplw\") pod \"cluster-image-registry-operator-dc59b4c8b-7trjg\" (UID: \"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.432692 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvm5\" (UniqueName: \"kubernetes.io/projected/8f3d8283-7857-4e35-8cf6-bbec3d0e767e-kube-api-access-msvm5\") pod \"openshift-config-operator-7777fb866f-f8cxm\" (UID: \"8f3d8283-7857-4e35-8cf6-bbec3d0e767e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.452340 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.452841 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d29905da-24ff-4cf7-93f2-4b20a1a9b934-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swt8h\" (UID: \"d29905da-24ff-4cf7-93f2-4b20a1a9b934\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.457046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.475938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5tt\" (UniqueName: \"kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt\") pod \"route-controller-manager-6576b87f9c-jwvjm\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.477727 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.478444 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.484099 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.495318 4772 request.go:700] Waited for 1.940918072s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.498279 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.503692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.517640 4772 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.536916 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.545414 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.551153 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.561951 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.578452 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.589167 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.593042 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.601533 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.618592 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.619392 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.637143 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.659813 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.686035 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.731221 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.739349 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" event={"ID":"f28a4e98-7805-44af-9ba1-4143a95625c5","Type":"ContainerStarted","Data":"2814fa23c53ff4210491820a22379b5fb601d6a52d4071707f7d69696dd90d2f"} Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.739396 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" event={"ID":"f28a4e98-7805-44af-9ba1-4143a95625c5","Type":"ContainerStarted","Data":"64522ebe1dc01650e833d7bfa456e52693d19b74b466ec3a9a0ff1221d84f55b"} Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.747960 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" event={"ID":"129e39d2-1f26-4919-b1d3-70597defd1c8","Type":"ContainerStarted","Data":"0d60ab2b6de2fd4c12569217458633e72c3f79c11f3798c8ace35264cf3e9e62"} Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.750848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tm4sk" event={"ID":"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7","Type":"ContainerStarted","Data":"998485b1c9b72c3a49c26832d0c80244e39846f8113a4b353c4d6fcd69b0181c"} Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.753718 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.753890 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-config\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.753915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.753964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.753980 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289c39b0-d332-437c-8392-eedbf591e057-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754073 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754109 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754130 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-client\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754265 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e81373-125e-4a51-875e-455dd284fa9a-service-ca-bundle\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754295 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2wd\" (UniqueName: \"kubernetes.io/projected/87d8b027-8111-447a-b6b2-da78394a12ef-kube-api-access-pd2wd\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754314 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld4r5\" (UniqueName: 
\"kubernetes.io/projected/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-kube-api-access-ld4r5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754394 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hldm\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754416 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da0f03cd-b63a-4eed-b9bd-22260e305ef9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754442 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754459 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754499 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8da1195c-0df9-4c38-b016-c71d6e7b612a-proxy-tls\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754564 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754582 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754609 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754635 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws84h\" (UniqueName: \"kubernetes.io/projected/42c65972-2272-40d8-a0d8-fa2cf83449c1-kube-api-access-ws84h\") pod \"migrator-59844c95c7-dds9w\" (UID: \"42c65972-2272-40d8-a0d8-fa2cf83449c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754651 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpzbp\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-kube-api-access-cpzbp\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754690 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4696d05-0539-4fe2-83b4-aab389bcf124-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754723 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfrj\" (UniqueName: \"kubernetes.io/projected/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-kube-api-access-5nfrj\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754739 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754758 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da0f03cd-b63a-4eed-b9bd-22260e305ef9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/289c39b0-d332-437c-8392-eedbf591e057-config\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754859 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-metrics-certs\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754944 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.754972 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-service-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.755005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-default-certificate\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.755042 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3c602c-177d-4e38-b503-b449586c6bf1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: E0930 17:04:11.759982 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.259961872 +0000 UTC m=+153.166974703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.765136 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.777302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7gg\" (UniqueName: \"kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.777444 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87d8b027-8111-447a-b6b2-da78394a12ef-proxy-tls\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.777507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c602c-177d-4e38-b503-b449586c6bf1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.777544 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxb4c\" (UniqueName: \"kubernetes.io/projected/e4696d05-0539-4fe2-83b4-aab389bcf124-kube-api-access-cxb4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778262 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb2js\" (UniqueName: \"kubernetes.io/projected/f8e81373-125e-4a51-875e-455dd284fa9a-kube-api-access-sb2js\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778326 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778381 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4gl\" (UniqueName: \"kubernetes.io/projected/d27083b2-f7dd-41bb-b3b1-5eae15310453-kube-api-access-jl4gl\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778426 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/87d8b027-8111-447a-b6b2-da78394a12ef-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778477 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778505 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-serving-cert\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4696d05-0539-4fe2-83b4-aab389bcf124-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778712 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/289c39b0-d332-437c-8392-eedbf591e057-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778747 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c3c602c-177d-4e38-b503-b449586c6bf1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrh85\" (UniqueName: \"kubernetes.io/projected/8da1195c-0df9-4c38-b016-c71d6e7b612a-kube-api-access-jrh85\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 
17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-images\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.778953 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-stats-auth\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883087 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:11 crc kubenswrapper[4772]: E0930 17:04:11.883296 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.383256914 +0000 UTC m=+153.290269755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883410 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld4r5\" (UniqueName: \"kubernetes.io/projected/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-kube-api-access-ld4r5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883443 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883478 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hldm\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883504 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-key\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883524 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da0f03cd-b63a-4eed-b9bd-22260e305ef9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883557 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883578 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-profile-collector-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8da1195c-0df9-4c38-b016-c71d6e7b612a-proxy-tls\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883647 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883666 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-apiservice-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883734 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-csi-data-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883753 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883785 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws84h\" (UniqueName: \"kubernetes.io/projected/42c65972-2272-40d8-a0d8-fa2cf83449c1-kube-api-access-ws84h\") pod \"migrator-59844c95c7-dds9w\" (UID: \"42c65972-2272-40d8-a0d8-fa2cf83449c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzbp\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-kube-api-access-cpzbp\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: 
\"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883830 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-certs\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883866 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4696d05-0539-4fe2-83b4-aab389bcf124-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883889 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfrj\" (UniqueName: \"kubernetes.io/projected/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-kube-api-access-5nfrj\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.883909 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da0f03cd-b63a-4eed-b9bd-22260e305ef9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884270 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-srv-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884307 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289c39b0-d332-437c-8392-eedbf591e057-config\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884324 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc 
kubenswrapper[4772]: I0930 17:04:11.884341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmjp\" (UniqueName: \"kubernetes.io/projected/aa56ffe7-d880-43d4-b0bb-135e1016d110-kube-api-access-sfmjp\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884397 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-metrics-certs\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885018 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.884421 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-cabundle\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885554 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885598 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236d3ca8-1434-428c-a244-d6ed1ca8a299-config-volume\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885617 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/236d3ca8-1434-428c-a244-d6ed1ca8a299-metrics-tls\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885635 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf9kf\" (UniqueName: \"kubernetes.io/projected/8b47c59f-07ad-4c56-b65b-bc0598b7f456-kube-api-access-pf9kf\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885691 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-srv-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885719 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-serving-cert\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885755 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4845\" (UniqueName: \"kubernetes.io/projected/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-kube-api-access-t4845\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885777 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-node-bootstrap-token\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-service-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885833 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-default-certificate\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.885906 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-mountpoint-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.886122 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.887201 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289c39b0-d332-437c-8392-eedbf591e057-config\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.888769 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.888940 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3c602c-177d-4e38-b503-b449586c6bf1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889776 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7gg\" (UniqueName: \"kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889834 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhzhw\" (UniqueName: \"kubernetes.io/projected/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-kube-api-access-zhzhw\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889865 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87d8b027-8111-447a-b6b2-da78394a12ef-proxy-tls\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: 
\"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889895 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c602c-177d-4e38-b503-b449586c6bf1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889905 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxb4c\" (UniqueName: \"kubernetes.io/projected/e4696d05-0539-4fe2-83b4-aab389bcf124-kube-api-access-cxb4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.889990 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb2js\" (UniqueName: \"kubernetes.io/projected/f8e81373-125e-4a51-875e-455dd284fa9a-kube-api-access-sb2js\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8423c1fe-7e8a-4848-bb77-c8ab059319fc-tmpfs\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890081 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4gl\" (UniqueName: \"kubernetes.io/projected/d27083b2-f7dd-41bb-b3b1-5eae15310453-kube-api-access-jl4gl\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc 
kubenswrapper[4772]: I0930 17:04:11.890105 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/87d8b027-8111-447a-b6b2-da78394a12ef-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890145 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890163 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjdw\" (UniqueName: \"kubernetes.io/projected/66cf67d4-4359-478a-93b2-91415bc629f0-kube-api-access-7mjdw\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890202 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-serving-cert\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890222 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4696d05-0539-4fe2-83b4-aab389bcf124-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/289c39b0-d332-437c-8392-eedbf591e057-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c3c602c-177d-4e38-b503-b449586c6bf1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890289 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnzjr\" (UniqueName: \"kubernetes.io/projected/8423c1fe-7e8a-4848-bb77-c8ab059319fc-kube-api-access-qnzjr\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46n9n\" (UniqueName: \"kubernetes.io/projected/5a7e8214-29ef-48d7-aea5-bdca17750404-kube-api-access-46n9n\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-webhook-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrh85\" (UniqueName: \"kubernetes.io/projected/8da1195c-0df9-4c38-b016-c71d6e7b612a-kube-api-access-jrh85\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890410 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-images\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890431 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890472 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8zwb\" (UniqueName: \"kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 
17:04:11.890491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66cf67d4-4359-478a-93b2-91415bc629f0-metrics-tls\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf762\" (UniqueName: \"kubernetes.io/projected/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-kube-api-access-jf762\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890524 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-plugins-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890545 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-stats-auth\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890563 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890582 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-socket-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-config\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890644 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhz2\" (UniqueName: 
\"kubernetes.io/projected/e81ac593-66e9-4480-bdab-3509eb2f23ad-kube-api-access-gvhz2\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890663 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81ac593-66e9-4480-bdab-3509eb2f23ad-cert\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890683 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9s7\" (UniqueName: \"kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890701 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890721 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g67kd\" (UniqueName: \"kubernetes.io/projected/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-kube-api-access-g67kd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890741 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-registration-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c602c-177d-4e38-b503-b449586c6bf1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289c39b0-d332-437c-8392-eedbf591e057-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890801 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890817 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-config\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890947 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q69q\" (UniqueName: \"kubernetes.io/projected/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-kube-api-access-9q69q\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.890977 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891010 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-client\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891117 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e81373-125e-4a51-875e-455dd284fa9a-service-ca-bundle\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " 
pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891137 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4kd\" (UniqueName: \"kubernetes.io/projected/236d3ca8-1434-428c-a244-d6ed1ca8a299-kube-api-access-lq4kd\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891162 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2wd\" (UniqueName: \"kubernetes.io/projected/87d8b027-8111-447a-b6b2-da78394a12ef-kube-api-access-pd2wd\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891402 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: E0930 17:04:11.891665 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.391629502 +0000 UTC m=+153.298642333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.891654 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-service-ca\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.892387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/87d8b027-8111-447a-b6b2-da78394a12ef-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.892461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27083b2-f7dd-41bb-b3b1-5eae15310453-config\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.893272 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/8da1195c-0df9-4c38-b016-c71d6e7b612a-images\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.893579 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.893681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e81373-125e-4a51-875e-455dd284fa9a-service-ca-bundle\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.895944 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-default-certificate\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.895957 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da0f03cd-b63a-4eed-b9bd-22260e305ef9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.896035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.896609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4696d05-0539-4fe2-83b4-aab389bcf124-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.897005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.900712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-etcd-client\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 
17:04:11.901174 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-metrics-certs\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.902947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.906774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.907094 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27083b2-f7dd-41bb-b3b1-5eae15310453-serving-cert\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.907717 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e81373-125e-4a51-875e-455dd284fa9a-stats-auth\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.913120 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.914189 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289c39b0-d332-437c-8392-eedbf591e057-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.915878 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.916201 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da0f03cd-b63a-4eed-b9bd-22260e305ef9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: 
\"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.916473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3c602c-177d-4e38-b503-b449586c6bf1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.916714 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4696d05-0539-4fe2-83b4-aab389bcf124-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.920252 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87d8b027-8111-447a-b6b2-da78394a12ef-proxy-tls\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.922890 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld4r5\" (UniqueName: \"kubernetes.io/projected/16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb-kube-api-access-ld4r5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pkshn\" (UID: \"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.957813 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8da1195c-0df9-4c38-b016-c71d6e7b612a-proxy-tls\") pod \"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.960978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hldm\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.971147 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzbp\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-kube-api-access-cpzbp\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.975764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfrj\" (UniqueName: \"kubernetes.io/projected/9a5d3cdf-3b39-4f66-a393-f0665cc68da7-kube-api-access-5nfrj\") pod \"multus-admission-controller-857f4d67dd-z2t2t\" (UID: \"9a5d3cdf-3b39-4f66-a393-f0665cc68da7\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.991662 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.991814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhzhw\" (UniqueName: \"kubernetes.io/projected/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-kube-api-access-zhzhw\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: E0930 17:04:11.991880 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.491852793 +0000 UTC m=+153.398865664 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.991947 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8423c1fe-7e8a-4848-bb77-c8ab059319fc-tmpfs\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992020 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992089 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjdw\" (UniqueName: \"kubernetes.io/projected/66cf67d4-4359-478a-93b2-91415bc629f0-kube-api-access-7mjdw\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992140 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnzjr\" (UniqueName: \"kubernetes.io/projected/8423c1fe-7e8a-4848-bb77-c8ab059319fc-kube-api-access-qnzjr\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992166 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-46n9n\" (UniqueName: \"kubernetes.io/projected/5a7e8214-29ef-48d7-aea5-bdca17750404-kube-api-access-46n9n\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992181 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-webhook-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66cf67d4-4359-478a-93b2-91415bc629f0-metrics-tls\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992245 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8zwb\" (UniqueName: \"kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992279 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf762\" (UniqueName: \"kubernetes.io/projected/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-kube-api-access-jf762\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992296 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-plugins-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992311 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-socket-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992330 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvhz2\" (UniqueName: \"kubernetes.io/projected/e81ac593-66e9-4480-bdab-3509eb2f23ad-kube-api-access-gvhz2\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " 
pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-registration-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992360 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81ac593-66e9-4480-bdab-3509eb2f23ad-cert\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s7\" (UniqueName: \"kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992389 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992411 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g67kd\" (UniqueName: \"kubernetes.io/projected/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-kube-api-access-g67kd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992432 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-config\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992468 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q69q\" (UniqueName: \"kubernetes.io/projected/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-kube-api-access-9q69q\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992491 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4kd\" (UniqueName: \"kubernetes.io/projected/236d3ca8-1434-428c-a244-d6ed1ca8a299-kube-api-access-lq4kd\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992533 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-key\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-profile-collector-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-apiservice-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-csi-data-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992612 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-certs\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-srv-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992644 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmjp\" (UniqueName: \"kubernetes.io/projected/aa56ffe7-d880-43d4-b0bb-135e1016d110-kube-api-access-sfmjp\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 
17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992660 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992679 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-cabundle\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236d3ca8-1434-428c-a244-d6ed1ca8a299-config-volume\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992749 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/236d3ca8-1434-428c-a244-d6ed1ca8a299-metrics-tls\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf9kf\" (UniqueName: \"kubernetes.io/projected/8b47c59f-07ad-4c56-b65b-bc0598b7f456-kube-api-access-pf9kf\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992790 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-srv-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-serving-cert\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992821 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4845\" (UniqueName: \"kubernetes.io/projected/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-kube-api-access-t4845\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992836 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-node-bootstrap-token\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992852 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-mountpoint-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992901 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8rwzz"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-mountpoint-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992927 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ppvk7"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992948 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992957 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992967 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jnm2b"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.992976 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klzl8"] Sep 30 17:04:11 crc kubenswrapper[4772]: I0930 17:04:11.993267 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8423c1fe-7e8a-4848-bb77-c8ab059319fc-tmpfs\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.001558 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.004209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.004870 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-config\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.004931 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.005413 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236d3ca8-1434-428c-a244-d6ed1ca8a299-config-volume\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.007310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-srv-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.007411 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-csi-data-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.010972 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-key\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.013516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-signing-cabundle\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.014389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a7e8214-29ef-48d7-aea5-bdca17750404-profile-collector-cert\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.014739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-plugins-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.014792 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-socket-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.014893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa56ffe7-d880-43d4-b0bb-135e1016d110-registration-dir\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.016387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws84h\" (UniqueName: \"kubernetes.io/projected/42c65972-2272-40d8-a0d8-fa2cf83449c1-kube-api-access-ws84h\") pod \"migrator-59844c95c7-dds9w\" (UID: \"42c65972-2272-40d8-a0d8-fa2cf83449c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.016543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66cf67d4-4359-478a-93b2-91415bc629f0-metrics-tls\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.016688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.016900 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.017421 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.018432 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-node-bootstrap-token\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.018479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-serving-cert\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.019349 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.019542 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-srv-cert\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.019725 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.519704439 +0000 UTC m=+153.426717450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.020826 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-webhook-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.027686 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81ac593-66e9-4480-bdab-3509eb2f23ad-cert\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.028267 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/236d3ca8-1434-428c-a244-d6ed1ca8a299-metrics-tls\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.028782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8423c1fe-7e8a-4848-bb77-c8ab059319fc-apiservice-cert\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.036947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b47c59f-07ad-4c56-b65b-bc0598b7f456-certs\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.041159 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.046452 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.050586 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.051861 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxb4c\" (UniqueName: \"kubernetes.io/projected/e4696d05-0539-4fe2-83b4-aab389bcf124-kube-api-access-cxb4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gn9w\" (UID: \"e4696d05-0539-4fe2-83b4-aab389bcf124\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.057754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.071879 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7gg\" (UniqueName: \"kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg\") pod \"controller-manager-879f6c89f-ghsks\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.081933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb2js\" (UniqueName: \"kubernetes.io/projected/f8e81373-125e-4a51-875e-455dd284fa9a-kube-api-access-sb2js\") pod \"router-default-5444994796-n4pdp\" (UID: \"f8e81373-125e-4a51-875e-455dd284fa9a\") " pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.093528 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.094092 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.594045015 +0000 UTC m=+153.501057846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.108104 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.109843 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da0f03cd-b63a-4eed-b9bd-22260e305ef9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lxhvh\" (UID: \"da0f03cd-b63a-4eed-b9bd-22260e305ef9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.115269 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlsdw"] Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.122478 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4gl\" (UniqueName: \"kubernetes.io/projected/d27083b2-f7dd-41bb-b3b1-5eae15310453-kube-api-access-jl4gl\") pod \"etcd-operator-b45778765-dt94w\" (UID: \"d27083b2-f7dd-41bb-b3b1-5eae15310453\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.134799 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:04:12 crc kubenswrapper[4772]: W0930 17:04:12.141460 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63eeced4_9c90_46e5_9234_938f88df7c49.slice/crio-87adc10635ca6554c206b32f3f275ecd982fd5fcd3476fc9a227f85df971254e WatchSource:0}: Error finding container 87adc10635ca6554c206b32f3f275ecd982fd5fcd3476fc9a227f85df971254e: Status 404 returned error can't find the container with id 87adc10635ca6554c206b32f3f275ecd982fd5fcd3476fc9a227f85df971254e Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.146536 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/289c39b0-d332-437c-8392-eedbf591e057-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sghvc\" (UID: \"289c39b0-d332-437c-8392-eedbf591e057\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.155350 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c3c602c-177d-4e38-b503-b449586c6bf1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-79czd\" (UID: \"0c3c602c-177d-4e38-b503-b449586c6bf1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.171644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrh85\" (UniqueName: \"kubernetes.io/projected/8da1195c-0df9-4c38-b016-c71d6e7b612a-kube-api-access-jrh85\") pod 
\"machine-config-operator-74547568cd-rn7gv\" (UID: \"8da1195c-0df9-4c38-b016-c71d6e7b612a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.175389 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm"] Sep 30 17:04:12 crc kubenswrapper[4772]: W0930 17:04:12.177593 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f033de_0c77_4c9e_bd73_873fe5ecce6c.slice/crio-c0df06524ad62ee46580bddc698ca68f2275d794b00030140c7bf4386e8a66b6 WatchSource:0}: Error finding container c0df06524ad62ee46580bddc698ca68f2275d794b00030140c7bf4386e8a66b6: Status 404 returned error can't find the container with id c0df06524ad62ee46580bddc698ca68f2275d794b00030140c7bf4386e8a66b6 Sep 30 17:04:12 crc kubenswrapper[4772]: W0930 17:04:12.178546 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8187513e_1ddb_4a68_8a95_e5c5b1d2206a.slice/crio-02d3879ae09992eaea5afdf29f10c28fbfa5c34031fe67691ea19209a13a3d14 WatchSource:0}: Error finding container 02d3879ae09992eaea5afdf29f10c28fbfa5c34031fe67691ea19209a13a3d14: Status 404 returned error can't find the container with id 02d3879ae09992eaea5afdf29f10c28fbfa5c34031fe67691ea19209a13a3d14 Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.195331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.198408 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.698372023 +0000 UTC m=+153.605384854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.220435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2wd\" (UniqueName: \"kubernetes.io/projected/87d8b027-8111-447a-b6b2-da78394a12ef-kube-api-access-pd2wd\") pod \"machine-config-controller-84d6567774-9bdx2\" (UID: \"87d8b027-8111-447a-b6b2-da78394a12ef\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.253400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhzhw\" (UniqueName: \"kubernetes.io/projected/cf5270f1-1160-4e12-ab8e-94f95b79ab1d-kube-api-access-zhzhw\") pod \"service-ca-9c57cc56f-grlbs\" (UID: \"cf5270f1-1160-4e12-ab8e-94f95b79ab1d\") " pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.275657 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q69q\" (UniqueName: \"kubernetes.io/projected/e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2-kube-api-access-9q69q\") pod \"package-server-manager-789f6589d5-mj4hj\" (UID: \"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.287499 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.294730 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h"] Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.297268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.297900 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.797880375 +0000 UTC m=+153.704893206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.300165 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf9kf\" (UniqueName: \"kubernetes.io/projected/8b47c59f-07ad-4c56-b65b-bc0598b7f456-kube-api-access-pf9kf\") pod \"machine-config-server-tqlhz\" (UID: \"8b47c59f-07ad-4c56-b65b-bc0598b7f456\") " pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.310371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjdw\" (UniqueName: \"kubernetes.io/projected/66cf67d4-4359-478a-93b2-91415bc629f0-kube-api-access-7mjdw\") pod \"dns-operator-744455d44c-sbg9r\" (UID: \"66cf67d4-4359-478a-93b2-91415bc629f0\") " pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.320163 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.320781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnzjr\" (UniqueName: \"kubernetes.io/projected/8423c1fe-7e8a-4848-bb77-c8ab059319fc-kube-api-access-qnzjr\") pod \"packageserver-d55dfcdfc-stgbr\" (UID: \"8423c1fe-7e8a-4848-bb77-c8ab059319fc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.326145 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.334826 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.338566 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46n9n\" (UniqueName: \"kubernetes.io/projected/5a7e8214-29ef-48d7-aea5-bdca17750404-kube-api-access-46n9n\") pod \"catalog-operator-68c6474976-7g4kc\" (UID: \"5a7e8214-29ef-48d7-aea5-bdca17750404\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.354379 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4kd\" (UniqueName: \"kubernetes.io/projected/236d3ca8-1434-428c-a244-d6ed1ca8a299-kube-api-access-lq4kd\") pod \"dns-default-gprnh\" (UID: \"236d3ca8-1434-428c-a244-d6ed1ca8a299\") " pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.359740 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.367926 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.369250 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn"] Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.372285 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4845\" (UniqueName: \"kubernetes.io/projected/ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c-kube-api-access-t4845\") pod \"olm-operator-6b444d44fb-7p9mp\" (UID: \"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.381113 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.385610 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.392235 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8zwb\" (UniqueName: \"kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb\") pod \"marketplace-operator-79b997595-tbscw\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.398186 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.399134 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.399610 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:12.899585775 +0000 UTC m=+153.806598666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.421521 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.425371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf762\" (UniqueName: \"kubernetes.io/projected/c6d97196-ddfc-4eff-9a79-d4c8e3698c49-kube-api-access-jf762\") pod \"service-ca-operator-777779d784-z4m7l\" (UID: \"c6d97196-ddfc-4eff-9a79-d4c8e3698c49\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.430439 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.434984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhz2\" (UniqueName: \"kubernetes.io/projected/e81ac593-66e9-4480-bdab-3509eb2f23ad-kube-api-access-gvhz2\") pod \"ingress-canary-5rtbd\" (UID: \"e81ac593-66e9-4480-bdab-3509eb2f23ad\") " pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.440303 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.446538 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z2t2t"] Sep 30 17:04:12 crc kubenswrapper[4772]: W0930 17:04:12.463410 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29905da_24ff_4cf7_93f2_4b20a1a9b934.slice/crio-486c94c6069b8d48e30932a5679f34991e21c4fb64c1a62edfa518384b3bd9b5 WatchSource:0}: Error finding container 486c94c6069b8d48e30932a5679f34991e21c4fb64c1a62edfa518384b3bd9b5: Status 404 returned error can't find the container with id 486c94c6069b8d48e30932a5679f34991e21c4fb64c1a62edfa518384b3bd9b5 Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.465271 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9s7\" (UniqueName: \"kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7\") pod \"collect-profiles-29320860-hlnzc\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.474354 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.480087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g67kd\" (UniqueName: \"kubernetes.io/projected/0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9-kube-api-access-g67kd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdmvr\" (UID: \"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.481934 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.488660 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.493853 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmjp\" (UniqueName: \"kubernetes.io/projected/aa56ffe7-d880-43d4-b0bb-135e1016d110-kube-api-access-sfmjp\") pod \"csi-hostpathplugin-8k777\" (UID: \"aa56ffe7-d880-43d4-b0bb-135e1016d110\") " pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.499197 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.500488 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.500890 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.000867634 +0000 UTC m=+153.907880465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.502748 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w"] Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.513764 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tqlhz" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.529585 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8k777" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.546213 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.551277 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5rtbd" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.601796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.602940 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.102893951 +0000 UTC m=+154.009906822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.704432 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.704639 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.204597631 +0000 UTC m=+154.111610462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.704809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.705159 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.205141215 +0000 UTC m=+154.112154046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.712853 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.748633 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.761697 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.764446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" event={"ID":"f28a4e98-7805-44af-9ba1-4143a95625c5","Type":"ContainerStarted","Data":"637544e02aad53f41e130980d23bef891e1151b25a320226ef2b46944b49ca04"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.771420 4772 generic.go:334] "Generic (PLEG): container finished" podID="211ab76e-6958-4f86-9549-06542e81a3e7" containerID="0f92259ba63e9ebdd7e3bfc57231ebd25ec35ad555a2e0340906d9d627402d5d" exitCode=0 Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.771487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" event={"ID":"211ab76e-6958-4f86-9549-06542e81a3e7","Type":"ContainerDied","Data":"0f92259ba63e9ebdd7e3bfc57231ebd25ec35ad555a2e0340906d9d627402d5d"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.773293 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" event={"ID":"211ab76e-6958-4f86-9549-06542e81a3e7","Type":"ContainerStarted","Data":"1b9f36ba8d7cf22fa9ae6136c41ee6ce913a2ef55248b6528ddb04e5810def81"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.786166 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" event={"ID":"9a5d3cdf-3b39-4f66-a393-f0665cc68da7","Type":"ContainerStarted","Data":"002f2566bcc209d8dd73a01b4923b2cbf89b62bd4b48b3e67cf65243779469ca"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.803962 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" event={"ID":"b023c669-cb19-4010-b9d7-120bdfff87bd","Type":"ContainerStarted","Data":"71fbf91b57851940a2f2d2e1ce549fb45c7b6a3781abc822fba284b1251a5c3d"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.804314 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" event={"ID":"b023c669-cb19-4010-b9d7-120bdfff87bd","Type":"ContainerStarted","Data":"1d06375a167adfa62fe64a6d22c41269db5e4777df493fc4b8e46503913fd886"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.806275 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.806558 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.306543767 +0000 UTC m=+154.213556598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.814411 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" event={"ID":"129e39d2-1f26-4919-b1d3-70597defd1c8","Type":"ContainerStarted","Data":"93510f08aabe3f7eaa13eb50487071d6832544033edbc02dea1ea47044a7cc06"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.814460 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" event={"ID":"129e39d2-1f26-4919-b1d3-70597defd1c8","Type":"ContainerStarted","Data":"f20536fbfa5144884dc3ebd7deb9a8d70ab44b69d0f4524d99e29c9317e2c0ce"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.818684 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ppvk7" event={"ID":"2bda8593-604a-4bf9-9cd1-0d56310dd0f0","Type":"ContainerStarted","Data":"6785fcc4acc2341d35381e07785c977e4be6b2457a9358cc872f93b69c91be9a"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.818723 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ppvk7" event={"ID":"2bda8593-604a-4bf9-9cd1-0d56310dd0f0","Type":"ContainerStarted","Data":"b340158c36cfbeec8de6fd11606a295f4897dc8dc9f517f26a53ec9dfb59535f"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.818880 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.821405 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-ppvk7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.821454 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ppvk7" podUID="2bda8593-604a-4bf9-9cd1-0d56310dd0f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.822910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" 
event={"ID":"42c65972-2272-40d8-a0d8-fa2cf83449c1","Type":"ContainerStarted","Data":"40b4059a5068a0b98d224b0c053d48243d0a2aa8da2a3770916ac527c5e2ee2d"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.824694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tm4sk" event={"ID":"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7","Type":"ContainerStarted","Data":"50630cf5ccfe12326dbbb3c8ab68a449dceff0f0bf220a060582e638c809cb8b"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.832814 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" event={"ID":"44f033de-0c77-4c9e-bd73-873fe5ecce6c","Type":"ContainerStarted","Data":"4e25a53dd0b4e17ae46b6fe0da42e41107c2da664df1ee0eb03eb9f93a633fe0"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.832874 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" event={"ID":"44f033de-0c77-4c9e-bd73-873fe5ecce6c","Type":"ContainerStarted","Data":"c0df06524ad62ee46580bddc698ca68f2275d794b00030140c7bf4386e8a66b6"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.834859 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.835878 4772 patch_prober.go:28] interesting pod/console-operator-58897d9998-wlsdw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.835934 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" podUID="44f033de-0c77-4c9e-bd73-873fe5ecce6c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.845142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" event={"ID":"056fc2a2-f5db-4887-bada-a7215edd00d4","Type":"ContainerStarted","Data":"10ab885f28ab9e518bc859529630ac60af8d15fd399245dad3f571d0bb05e750"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.854317 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" event={"ID":"8187513e-1ddb-4a68-8a95-e5c5b1d2206a","Type":"ContainerStarted","Data":"c3bd4890fc12a6785ad5e3df2e0f7b680510b554973819e14af739cac592187b"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.854374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" event={"ID":"8187513e-1ddb-4a68-8a95-e5c5b1d2206a","Type":"ContainerStarted","Data":"02d3879ae09992eaea5afdf29f10c28fbfa5c34031fe67691ea19209a13a3d14"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.855182 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.876836 4772 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jwvjm container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.876895 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.912069 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" event={"ID":"49321d19-b839-494f-a4f2-5505fb7ad9ab","Type":"ContainerStarted","Data":"9cbe91fa11b615f979bea315534146dee71dc8a4ce5c5b45c1e6df6e1c2a8322"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.912116 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" event={"ID":"49321d19-b839-494f-a4f2-5505fb7ad9ab","Type":"ContainerStarted","Data":"f0204031104948b0b53379463b37c25bf68b23f6e36eda04f815e493635fab32"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.913759 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:12 crc kubenswrapper[4772]: E0930 17:04:12.916667 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.416651905 +0000 UTC m=+154.323664736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.939720 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" event={"ID":"63eeced4-9c90-46e5-9234-938f88df7c49","Type":"ContainerStarted","Data":"87adc10635ca6554c206b32f3f275ecd982fd5fcd3476fc9a227f85df971254e"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.940891 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.942272 4772 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r4gqw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.942320 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.963503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" event={"ID":"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb","Type":"ContainerStarted","Data":"aa9cfdb7918047de2de2107579b775e20e64cb93f675ce1c7241c8b5f64fb559"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.967864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" event={"ID":"8f3d8283-7857-4e35-8cf6-bbec3d0e767e","Type":"ContainerStarted","Data":"92bd37dd8f40e495fec10138225c124fb8dca0eb3ba68718b468e619ff9f8cfb"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.971225 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" event={"ID":"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715","Type":"ContainerStarted","Data":"34eb8670bf983b7dcecc6abb43c3aa898187faa8d2c3bd6a16cd3be871146049"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.971258 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" event={"ID":"c5b8fb6d-b7cf-4cfa-9217-fe92fe6c7715","Type":"ContainerStarted","Data":"8af1915905a277aee0741b26a794c884cd52d230fa9493d301ccf254842f3d40"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.975566 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" event={"ID":"fafebd4f-5889-4b08-9e9f-0192504348c9","Type":"ContainerStarted","Data":"2b898aa1e51e35e40bb771c4b6bc57f43c0e3e9777d519968416c4bef5458c3a"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.975601 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" event={"ID":"fafebd4f-5889-4b08-9e9f-0192504348c9","Type":"ContainerStarted","Data":"a6f76cc073e9cd50e4bb975d15c5c188522dcf8018fdd95523ebcfd601e33475"} Sep 30 17:04:12 crc kubenswrapper[4772]: I0930 17:04:12.976533 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" event={"ID":"d29905da-24ff-4cf7-93f2-4b20a1a9b934","Type":"ContainerStarted","Data":"486c94c6069b8d48e30932a5679f34991e21c4fb64c1a62edfa518384b3bd9b5"} Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.015423 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.016775 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.516757413 +0000 UTC m=+154.423770244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.120320 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.121452 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.62143375 +0000 UTC m=+154.528446581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.217507 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr"] Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.223943 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.224109 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.724086804 +0000 UTC m=+154.631099645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.224246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.224579 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.724569287 +0000 UTC m=+154.631582118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.225754 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sbg9r"] Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.327204 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.328201 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.828179856 +0000 UTC m=+154.735192687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.376408 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tm4sk" podStartSLOduration=133.376381982 podStartE2EDuration="2m13.376381982s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:13.3229589 +0000 UTC m=+154.229971731" watchObservedRunningTime="2025-09-30 17:04:13.376381982 +0000 UTC m=+154.283394813" Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.391718 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5mt" podStartSLOduration=133.391699541 podStartE2EDuration="2m13.391699541s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:13.391634139 +0000 UTC m=+154.298646970" watchObservedRunningTime="2025-09-30 17:04:13.391699541 +0000 UTC m=+154.298712372" Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.435190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:13 crc kubenswrapper[4772]: 
I0930 17:04:13.437641 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ppvk7" podStartSLOduration=133.437612277 podStartE2EDuration="2m13.437612277s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:13.433439848 +0000 UTC m=+154.340452679" watchObservedRunningTime="2025-09-30 17:04:13.437612277 +0000 UTC m=+154.344625118" Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.441168 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:13.941147779 +0000 UTC m=+154.848160610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.536623 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.537343 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.037298014 +0000 UTC m=+154.944310845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.639086 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.640013 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.139997149 +0000 UTC m=+155.047009980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.710383 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" podStartSLOduration=133.710358672 podStartE2EDuration="2m13.710358672s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:13.708582186 +0000 UTC m=+154.615595007" watchObservedRunningTime="2025-09-30 17:04:13.710358672 +0000 UTC m=+154.617371513" Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.744209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.744720 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.244704207 +0000 UTC m=+155.151717038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.762444 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" podStartSLOduration=132.762427329 podStartE2EDuration="2m12.762427329s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:13.761985627 +0000 UTC m=+154.668998458" watchObservedRunningTime="2025-09-30 17:04:13.762427329 +0000 UTC m=+154.669440160" Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.846856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.847385 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.347369771 +0000 UTC m=+155.254382602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.860870 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd"] Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.874009 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv"] Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.936518 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc"] Sep 30 17:04:13 crc kubenswrapper[4772]: I0930 17:04:13.948514 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:13 crc kubenswrapper[4772]: E0930 17:04:13.949672 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.449651166 +0000 UTC m=+155.356663997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.010653 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tqlhz" event={"ID":"8b47c59f-07ad-4c56-b65b-bc0598b7f456","Type":"ContainerStarted","Data":"15ec2c669db698a7e4846d2b7ecf86e952788345dd4791bfb997bb08b05af347"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.010709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tqlhz" event={"ID":"8b47c59f-07ad-4c56-b65b-bc0598b7f456","Type":"ContainerStarted","Data":"6453e34606d34ddbdc80180772b65fc4de1fcb34e9aab3f1a46b47aa36bac7f4"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.013724 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" event={"ID":"42c65972-2272-40d8-a0d8-fa2cf83449c1","Type":"ContainerStarted","Data":"14a46deb1194b8712034beba7d3510abe1e35931879bbb7b359b162276ef1420"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.013764 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" event={"ID":"42c65972-2272-40d8-a0d8-fa2cf83449c1","Type":"ContainerStarted","Data":"d725b3f57cf7e977a4724bad82a779fe55ab0f5f9481e0c3f5fc6bee6b647572"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.050776 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.051741 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.551727755 +0000 UTC m=+155.458740586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.080113 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" event={"ID":"9a5d3cdf-3b39-4f66-a393-f0665cc68da7","Type":"ContainerStarted","Data":"8cb326df4560350040847488a693c252eaa40157b59022cbc07d4150b8020755"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.082194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" event={"ID":"d29905da-24ff-4cf7-93f2-4b20a1a9b934","Type":"ContainerStarted","Data":"5986f16af14eb89e36dc4c30c2d680d6ccf2623a8579106deaaeed3f5dcfc2fa"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.097994 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" event={"ID":"b023c669-cb19-4010-b9d7-120bdfff87bd","Type":"ContainerStarted","Data":"9a52f5fb1dfbb75f3a9c5ee305262e5800f5b69acfbb5150940eaaf62f34016f"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.134729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" event={"ID":"63eeced4-9c90-46e5-9234-938f88df7c49","Type":"ContainerStarted","Data":"a4448a5e0816ebb3aaea0e8bf6a247326e27cba12796f5884a432e9e8c636204"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.147230 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.153996 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.155279 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.655262622 +0000 UTC m=+155.562275443 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.162987 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.165472 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dt94w"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.167087 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n4pdp" event={"ID":"f8e81373-125e-4a51-875e-455dd284fa9a","Type":"ContainerStarted","Data":"7a5b9ae799c19845a05b7227903e0b16adb73e659143d1fc5cb7f01f24a47451"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.167131 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n4pdp" event={"ID":"f8e81373-125e-4a51-875e-455dd284fa9a","Type":"ContainerStarted","Data":"e54b58b535de5838760d97997dcf1baf0980d8f242d780c7cf9aa6d139c55794"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.172132 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.173922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" event={"ID":"211ab76e-6958-4f86-9549-06542e81a3e7","Type":"ContainerStarted","Data":"337a3f0b185dade630af4574a5d9812fe62bd7f160c048ed33c6ce4520230cb3"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.196142 4772 generic.go:334] "Generic (PLEG): container finished" podID="056fc2a2-f5db-4887-bada-a7215edd00d4" containerID="d26c42894aab1c2768844449c42522ee78271bc6f5606fbeaf20712d49c2238f" exitCode=0 Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.196216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" event={"ID":"056fc2a2-f5db-4887-bada-a7215edd00d4","Type":"ContainerDied","Data":"d26c42894aab1c2768844449c42522ee78271bc6f5606fbeaf20712d49c2238f"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.196247 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" event={"ID":"056fc2a2-f5db-4887-bada-a7215edd00d4","Type":"ContainerStarted","Data":"da6046ac8cdfca1f0d7ca674b5742e58b335b05bb93b5db82ccfcfc285bb9538"} Sep 30 17:04:14 crc kubenswrapper[4772]: W0930 17:04:14.202604 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27083b2_f7dd_41bb_b3b1_5eae15310453.slice/crio-4d55c08b8f138c6a53b7e17a96e3f565cad0d83bf4e343705da4cb494a380688 WatchSource:0}: Error finding container 4d55c08b8f138c6a53b7e17a96e3f565cad0d83bf4e343705da4cb494a380688: Status 404 returned error can't find the container with id 4d55c08b8f138c6a53b7e17a96e3f565cad0d83bf4e343705da4cb494a380688 Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.204371 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" event={"ID":"0c3c602c-177d-4e38-b503-b449586c6bf1","Type":"ContainerStarted","Data":"363b57b7128e3418b3d45f29af2c675338e94f73b70d45f81f57c9ac9537fd69"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.216333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" event={"ID":"16bf5150-0b5b-4fc8-ae9e-1ccc96e59dcb","Type":"ContainerStarted","Data":"14a4e4ae8fc22f7af742df5fbd6a8e0ba0e30102f2ae8c4711369bd2da288e65"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.222904 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f3d8283-7857-4e35-8cf6-bbec3d0e767e" containerID="620cb1e57169c5683c12500b6fa2c34876766ab15501e65a27f490557d947fb2" exitCode=0 Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.222955 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" event={"ID":"8f3d8283-7857-4e35-8cf6-bbec3d0e767e","Type":"ContainerDied","Data":"620cb1e57169c5683c12500b6fa2c34876766ab15501e65a27f490557d947fb2"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.229853 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" event={"ID":"289c39b0-d332-437c-8392-eedbf591e057","Type":"ContainerStarted","Data":"ed0eb68aa40d0233373205bf5d31f04dbf989da492e6cd4c4816a890b3db472f"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.232823 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" event={"ID":"8da1195c-0df9-4c38-b016-c71d6e7b612a","Type":"ContainerStarted","Data":"427a03d55213ea42082e63da38e606eb1f4f695a86193f50b56199ce7fd1f965"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.235493 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" event={"ID":"8423c1fe-7e8a-4848-bb77-c8ab059319fc","Type":"ContainerStarted","Data":"ce066f678266e1d1bd8bb308240b201ea65082ffb8ece69b1280357794cb5713"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.235516 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" event={"ID":"8423c1fe-7e8a-4848-bb77-c8ab059319fc","Type":"ContainerStarted","Data":"c451383bb7b2abf32472709cd663f434475c7b06c859627b6ec9ccdefe0a2502"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.237332 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.258153 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.261133 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:04:14.76111162 +0000 UTC m=+155.668124631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.313906 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crt6b" podStartSLOduration=134.313887395 podStartE2EDuration="2m14.313887395s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.290195847 +0000 UTC m=+155.197208668" watchObservedRunningTime="2025-09-30 17:04:14.313887395 +0000 UTC m=+155.220900226" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.325933 4772 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-stgbr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.326028 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" podUID="8423c1fe-7e8a-4848-bb77-c8ab059319fc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.335499 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.341951 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.342014 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.361772 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.362840 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:04:14.86282323 +0000 UTC m=+155.769836061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.364286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" event={"ID":"66cf67d4-4359-478a-93b2-91415bc629f0","Type":"ContainerStarted","Data":"205374b70a7406f0ffdd3745d0462ecd9b2885e07b43bc2acd49dcf08ad3f2d7"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.364324 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" event={"ID":"66cf67d4-4359-478a-93b2-91415bc629f0","Type":"ContainerStarted","Data":"12829b6982b5b6092e925919d8af758319bfa7a8437c0f0c070c33b9ab467613"} Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.443114 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-ppvk7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.443301 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ppvk7" podUID="2bda8593-604a-4bf9-9cd1-0d56310dd0f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.448185 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.453753 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.453848 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.465976 4772 patch_prober.go:28] interesting pod/console-operator-58897d9998-wlsdw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.469018 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" podUID="44f033de-0c77-4c9e-bd73-873fe5ecce6c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.470982 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.480093 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:14.979311074 +0000 UTC m=+155.886323895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.487317 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-grlbs"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.513050 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.529681 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8k777"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.530430 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.533504 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.549673 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5rtbd"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.554964 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" podStartSLOduration=134.554939614 podStartE2EDuration="2m14.554939614s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.49259824 +0000 UTC m=+155.399611081" watchObservedRunningTime="2025-09-30 17:04:14.554939614 +0000 UTC m=+155.461952445" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.566003 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pkshn" podStartSLOduration=134.565980372 podStartE2EDuration="2m14.565980372s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.520451146 +0000 UTC m=+155.427463997" watchObservedRunningTime="2025-09-30 17:04:14.565980372 +0000 UTC m=+155.472993203" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.584393 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.584770 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.084750951 +0000 UTC m=+155.991763782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.586004 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dds9w" podStartSLOduration=134.585814379 podStartE2EDuration="2m14.585814379s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.54442348 +0000 UTC m=+155.451436311" watchObservedRunningTime="2025-09-30 17:04:14.585814379 +0000 UTC m=+155.492827210" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.588629 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" podStartSLOduration=133.588614332 podStartE2EDuration="2m13.588614332s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.572354528 +0000 UTC m=+155.479367379" watchObservedRunningTime="2025-09-30 17:04:14.588614332 +0000 UTC m=+155.495627163" Sep 30 17:04:14 crc kubenswrapper[4772]: W0930 17:04:14.618716 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9dc342_40ca_4ac8_82eb_2c2f5c0b294c.slice/crio-a13f261d30bee989126d5a7d49dfef0a66b00c814e92b89d48be3893524a6382 WatchSource:0}: Error finding container a13f261d30bee989126d5a7d49dfef0a66b00c814e92b89d48be3893524a6382: Status 404 returned error can't find the container with id a13f261d30bee989126d5a7d49dfef0a66b00c814e92b89d48be3893524a6382 Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.638455 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tgc78" podStartSLOduration=134.638430239 podStartE2EDuration="2m14.638430239s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.630885373 +0000 UTC m=+155.537898204" watchObservedRunningTime="2025-09-30 17:04:14.638430239 +0000 UTC m=+155.545443070" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.641502 4772 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8rwzz" podStartSLOduration=134.641477849 podStartE2EDuration="2m14.641477849s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.601249491 +0000 UTC m=+155.508262322" watchObservedRunningTime="2025-09-30 17:04:14.641477849 +0000 UTC m=+155.548490680" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.663524 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gprnh"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.688948 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.689365 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.189350266 +0000 UTC m=+156.096363097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.692180 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.719160 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.764479 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-n4pdp" podStartSLOduration=134.764462173 podStartE2EDuration="2m14.764462173s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.756993388 +0000 UTC m=+155.664006229" watchObservedRunningTime="2025-09-30 17:04:14.764462173 +0000 UTC m=+155.671475004" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.782436 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.790515 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 
17:04:14.790937 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.290919482 +0000 UTC m=+156.197932313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.796495 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tqlhz" podStartSLOduration=5.796472507 podStartE2EDuration="5.796472507s" podCreationTimestamp="2025-09-30 17:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.792560535 +0000 UTC m=+155.699573396" watchObservedRunningTime="2025-09-30 17:04:14.796472507 +0000 UTC m=+155.703485338" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.797896 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr"] Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.878531 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" podStartSLOduration=133.878510774 podStartE2EDuration="2m13.878510774s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.877674302 +0000 UTC m=+155.784687133" watchObservedRunningTime="2025-09-30 17:04:14.878510774 +0000 UTC m=+155.785523605" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.879530 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-klzl8" podStartSLOduration=134.87952437 podStartE2EDuration="2m14.87952437s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.837031753 +0000 UTC m=+155.744044604" watchObservedRunningTime="2025-09-30 17:04:14.87952437 +0000 UTC m=+155.786537201" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.896881 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.897221 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:04:15.397208371 +0000 UTC m=+156.304221202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:14 crc kubenswrapper[4772]: W0930 17:04:14.905826 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a900ef2_a0f1_4a8b_b33a_7316c70cbaa9.slice/crio-e37690eab863c39e4a9584938d85f2edbf34227e2ff3920a9aeb065e5605e4ed WatchSource:0}: Error finding container e37690eab863c39e4a9584938d85f2edbf34227e2ff3920a9aeb065e5605e4ed: Status 404 returned error can't find the container with id e37690eab863c39e4a9584938d85f2edbf34227e2ff3920a9aeb065e5605e4ed Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.980246 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7trjg" podStartSLOduration=134.980227373 podStartE2EDuration="2m14.980227373s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.935581931 +0000 UTC m=+155.842594752" watchObservedRunningTime="2025-09-30 17:04:14.980227373 +0000 UTC m=+155.887240194" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.980936 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swt8h" podStartSLOduration=134.980930751 podStartE2EDuration="2m14.980930751s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:14.980919171 +0000 UTC m=+155.887932002" watchObservedRunningTime="2025-09-30 17:04:14.980930751 +0000 UTC m=+155.887943582" Sep 30 17:04:14 crc kubenswrapper[4772]: I0930 17:04:14.998554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:14 crc kubenswrapper[4772]: E0930 17:04:14.998956 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.49893057 +0000 UTC m=+156.405943401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.100303 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.100834 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.600805994 +0000 UTC m=+156.507818825 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.201401 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.201613 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.701584379 +0000 UTC m=+156.608597210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.202964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.203404 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.703391556 +0000 UTC m=+156.610404387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.303831 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.303934 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.803912895 +0000 UTC m=+156.710925726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.303998 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.304443 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.804433889 +0000 UTC m=+156.711446720 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.348271 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:15 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:15 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:15 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.348352 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.382326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8k777" event={"ID":"aa56ffe7-d880-43d4-b0bb-135e1016d110","Type":"ContainerStarted","Data":"ca1c8d74f014c54ca18c3c53b861c9a0f526bd8c6d384937817797c30dad21e9"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.414357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" event={"ID":"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4","Type":"ContainerStarted","Data":"f4b6de6f3bc5cbdc1fba6901e139438a6de714f24ff9431bcb5b3e852b9fb6d0"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.414418 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" event={"ID":"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4","Type":"ContainerStarted","Data":"e4e91750330371b554fb51e47020b4e7875c1dbeb65752cc3ea32046f22cda46"} Sep 
30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.415808 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.418964 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.419219 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:15.919184148 +0000 UTC m=+156.826196979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.421223 4772 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ghsks container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.421290 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.427413 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" event={"ID":"c6d97196-ddfc-4eff-9a79-d4c8e3698c49","Type":"ContainerStarted","Data":"69333a83b0f6349eb47d4ae118f8ac65af0997ddf3e29b58a3938fe6b5a8142f"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.427472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" event={"ID":"c6d97196-ddfc-4eff-9a79-d4c8e3698c49","Type":"ContainerStarted","Data":"b27f8c1384257b15ce8f60bece5c7b8a169fe60cf43cef92ca005bf9167b4b43"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.447675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" event={"ID":"e4696d05-0539-4fe2-83b4-aab389bcf124","Type":"ContainerStarted","Data":"94c237d4b753ba9612d1c3862188aee50c1f49573c2fe2bbbc916df71989fe9d"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.447718 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" 
event={"ID":"e4696d05-0539-4fe2-83b4-aab389bcf124","Type":"ContainerStarted","Data":"104bb811f5a58b824713458139ce7ec1ae6d7770bb7bb9c6e482e886227e87ae"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.454556 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" podStartSLOduration=135.454525559 podStartE2EDuration="2m15.454525559s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.448192504 +0000 UTC m=+156.355205365" watchObservedRunningTime="2025-09-30 17:04:15.454525559 +0000 UTC m=+156.361538390" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.469406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" event={"ID":"d27083b2-f7dd-41bb-b3b1-5eae15310453","Type":"ContainerStarted","Data":"458655bbca8b0b9cd4d0d9e0b14f626f87b422a1eb1abdbf6dcacfd92f5261aa"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.469452 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" event={"ID":"d27083b2-f7dd-41bb-b3b1-5eae15310453","Type":"ContainerStarted","Data":"4d55c08b8f138c6a53b7e17a96e3f565cad0d83bf4e343705da4cb494a380688"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.475026 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4m7l" podStartSLOduration=134.474984002 podStartE2EDuration="2m14.474984002s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.470418463 +0000 UTC m=+156.377431294" watchObservedRunningTime="2025-09-30 17:04:15.474984002 +0000 UTC m=+156.381996833" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.495980 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gn9w" podStartSLOduration=135.495960618 podStartE2EDuration="2m15.495960618s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.493545725 +0000 UTC m=+156.400558566" watchObservedRunningTime="2025-09-30 17:04:15.495960618 +0000 UTC m=+156.402973439" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.497466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" event={"ID":"cf5270f1-1160-4e12-ab8e-94f95b79ab1d","Type":"ContainerStarted","Data":"96b812cdee36ad6b76a8fa1becb90023fc9f67e94678febad35919245042a992"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.523009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.529779 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.029750158 +0000 UTC m=+156.936762989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.542443 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" event={"ID":"0c3c602c-177d-4e38-b503-b449586c6bf1","Type":"ContainerStarted","Data":"b3bd02c088dbf39792aea57ec4920546d12e516b122c058c54c35abfb234303c"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.544136 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dt94w" podStartSLOduration=135.544113983 podStartE2EDuration="2m15.544113983s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.543335192 +0000 UTC m=+156.450348023" watchObservedRunningTime="2025-09-30 17:04:15.544113983 +0000 UTC m=+156.451126814" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.546906 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" event={"ID":"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2","Type":"ContainerStarted","Data":"3d3ce94235a742adec4996d5231a7ea5d48f78dc3fd0d2bc97527eabe046c6fb"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.570487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" event={"ID":"8f3d8283-7857-4e35-8cf6-bbec3d0e767e","Type":"ContainerStarted","Data":"ba1594c6199ef60819b5a3248118950a0d5967011f18ca27f32f59c68732ebb4"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.571297 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.581696 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-79czd" podStartSLOduration=135.581677391 podStartE2EDuration="2m15.581677391s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.579454083 +0000 UTC m=+156.486466914" watchObservedRunningTime="2025-09-30 17:04:15.581677391 +0000 UTC m=+156.488690222" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.595470 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" event={"ID":"6e905513-23f6-4e8f-95df-0668beaad53d","Type":"ContainerStarted","Data":"d9872df9fc19ca594f389f675f377249f13fd71c9350456f503b1ca3555eb295"} Sep 30 17:04:15 crc kubenswrapper[4772]: 
I0930 17:04:15.625973 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.627144 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.127126825 +0000 UTC m=+157.034139646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.666260 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" event={"ID":"056fc2a2-f5db-4887-bada-a7215edd00d4","Type":"ContainerStarted","Data":"af6fa76b2a8bb51059c46d0255becdb8d7aee798acd73e5ae2bd7d95f234d8c2"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.683983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" event={"ID":"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c","Type":"ContainerStarted","Data":"a13f261d30bee989126d5a7d49dfef0a66b00c814e92b89d48be3893524a6382"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.701947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" event={"ID":"9a5d3cdf-3b39-4f66-a393-f0665cc68da7","Type":"ContainerStarted","Data":"afba2e912e1b72d2aba595827e55f45bb8bede1554a53f85a149093df750a321"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.729890 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.732777 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.232760427 +0000 UTC m=+157.139773258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.734379 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" event={"ID":"87d8b027-8111-447a-b6b2-da78394a12ef","Type":"ContainerStarted","Data":"4361708b7314ba35034f323b4dcae830c917f235a368767e5da076761f534924"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.734410 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" event={"ID":"87d8b027-8111-447a-b6b2-da78394a12ef","Type":"ContainerStarted","Data":"dae4575a5b96eab5e13e579478d723b5b4112ffcaffa8c63c611ef77f275ab92"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.744310 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" podStartSLOduration=135.744294137 podStartE2EDuration="2m15.744294137s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.628578243 +0000 UTC m=+156.535591074" watchObservedRunningTime="2025-09-30 17:04:15.744294137 +0000 UTC m=+156.651306958" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.744855 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" podStartSLOduration=135.744850772 podStartE2EDuration="2m15.744850772s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.723950327 +0000 UTC m=+156.630963168" watchObservedRunningTime="2025-09-30 17:04:15.744850772 +0000 UTC m=+156.651863593" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.780736 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" event={"ID":"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9","Type":"ContainerStarted","Data":"e37690eab863c39e4a9584938d85f2edbf34227e2ff3920a9aeb065e5605e4ed"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.822157 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5rtbd" event={"ID":"e81ac593-66e9-4480-bdab-3509eb2f23ad","Type":"ContainerStarted","Data":"1f14085f7204026d05b680a8b84e49744d17be9d0b244564ccf51ae38bb2fb83"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.834203 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.834623 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.33459899 +0000 UTC m=+157.241611821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.840804 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" event={"ID":"8a68e96d-d547-4060-8ab8-c693324a4423","Type":"ContainerStarted","Data":"2cef19e4762b79b3521ec8795684147844201729bbde62514a1e59b2e40672c0"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.844100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gprnh" event={"ID":"236d3ca8-1434-428c-a244-d6ed1ca8a299","Type":"ContainerStarted","Data":"5a26c25e9b0718769c7dd94a10bf57c04395c4e0f29dd5e9d5a2a7e10e88b810"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.846237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" event={"ID":"66cf67d4-4359-478a-93b2-91415bc629f0","Type":"ContainerStarted","Data":"d2dd2856e91345e7d6e496c66e0f54cd135acd8d776eae692bd6fff4a73d1bfb"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.855428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" event={"ID":"5a7e8214-29ef-48d7-aea5-bdca17750404","Type":"ContainerStarted","Data":"9b6d9bf21673b637493515bb548449c8d45264fe65cf795966679595137e6fc3"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.858025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" event={"ID":"da0f03cd-b63a-4eed-b9bd-22260e305ef9","Type":"ContainerStarted","Data":"69f6c9a1c404fb659182eefc49e4aa6297c57c52993f50cb102baf9d43696c59"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.858079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" event={"ID":"da0f03cd-b63a-4eed-b9bd-22260e305ef9","Type":"ContainerStarted","Data":"45ea8044123e15537dc93b10308b21938dc1f8c61920aefcb94e413b5a3fc8b4"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.885337 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-z2t2t" podStartSLOduration=135.885319091 podStartE2EDuration="2m15.885319091s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.881537833 +0000 UTC m=+156.788550664" watchObservedRunningTime="2025-09-30 17:04:15.885319091 +0000 UTC m=+156.792331932" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.891125 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" 
event={"ID":"8da1195c-0df9-4c38-b016-c71d6e7b612a","Type":"ContainerStarted","Data":"82a55f1d0c19f5522fd1ea880ba50c2d792deb08de2cf81f8559ccc252fce4e1"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.891167 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" event={"ID":"8da1195c-0df9-4c38-b016-c71d6e7b612a","Type":"ContainerStarted","Data":"d02402581061624bf45c672b611c067d23f3c65184efce0dfcd99bb5b71d2d60"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.895868 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" event={"ID":"289c39b0-d332-437c-8392-eedbf591e057","Type":"ContainerStarted","Data":"8f66a289a8e0520393b29f557d977ec1070ea8ae5adb032a70f873c75e51159e"} Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.936159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:15 crc kubenswrapper[4772]: E0930 17:04:15.937517 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.437496351 +0000 UTC m=+157.344509242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.960161 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wlsdw" Sep 30 17:04:15 crc kubenswrapper[4772]: I0930 17:04:15.989224 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" podStartSLOduration=135.989190777 podStartE2EDuration="2m15.989190777s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:15.964830243 +0000 UTC m=+156.871843074" watchObservedRunningTime="2025-09-30 17:04:15.989190777 +0000 UTC m=+156.896203608" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.026293 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-sbg9r" podStartSLOduration=136.026271323 podStartE2EDuration="2m16.026271323s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.025948305 +0000 UTC m=+156.932961126" watchObservedRunningTime="2025-09-30 17:04:16.026271323 +0000 UTC m=+156.933284154" Sep 30 17:04:16 crc kubenswrapper[4772]: 
I0930 17:04:16.039084 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.039532 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.539511768 +0000 UTC m=+157.446524599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.106298 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5rtbd" podStartSLOduration=7.106274798 podStartE2EDuration="7.106274798s" podCreationTimestamp="2025-09-30 17:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.10444702 +0000 UTC m=+157.011459851" watchObservedRunningTime="2025-09-30 17:04:16.106274798 +0000 UTC m=+157.013287629" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.139811 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rn7gv" podStartSLOduration=136.139795541 podStartE2EDuration="2m16.139795541s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.138506867 +0000 UTC m=+157.045519698" watchObservedRunningTime="2025-09-30 17:04:16.139795541 +0000 UTC m=+157.046808372" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.148150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.148547 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.648531668 +0000 UTC m=+157.555544499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.248646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.248956 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.748942154 +0000 UTC m=+157.655954975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.269115 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sghvc" podStartSLOduration=136.269098429 podStartE2EDuration="2m16.269098429s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.2173159 +0000 UTC m=+157.124328731" watchObservedRunningTime="2025-09-30 17:04:16.269098429 +0000 UTC m=+157.176111260" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.344336 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:16 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:16 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:16 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.344402 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.356708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.357216 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.857196844 +0000 UTC m=+157.764209675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.394532 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.395037 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.440807 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-stgbr" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.457467 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.457827 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.458724 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.459255 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.959230812 +0000 UTC m=+157.866243643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.459353 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.459857 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:16.959833488 +0000 UTC m=+157.866846319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.559948 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.560252 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.060210183 +0000 UTC m=+157.967223004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.661495 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.661910 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.161895332 +0000 UTC m=+158.068908163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.762931 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.763141 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.263109469 +0000 UTC m=+158.170122300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.763664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.764046 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.264038353 +0000 UTC m=+158.171051184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.864563 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.864809 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.364774417 +0000 UTC m=+158.271787248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.864971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.865383 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.365369813 +0000 UTC m=+158.272382644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.866930 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.903705 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5rtbd" event={"ID":"e81ac593-66e9-4480-bdab-3509eb2f23ad","Type":"ContainerStarted","Data":"bb63c083d0021b1656b1fc343e7a1d116fd57fe8cd3f57ca565bca1c284da45d"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.905475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" event={"ID":"6e905513-23f6-4e8f-95df-0668beaad53d","Type":"ContainerStarted","Data":"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.905858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.906679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" event={"ID":"cf5270f1-1160-4e12-ab8e-94f95b79ab1d","Type":"ContainerStarted","Data":"c68ec878b77811c5f82c57fa1fb3058d09366f46a791016fb61369d7e7bf84cd"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.908739 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tbscw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.908783 4772 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.913383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lxhvh" event={"ID":"da0f03cd-b63a-4eed-b9bd-22260e305ef9","Type":"ContainerStarted","Data":"5830f6f2938d222e2bfa30f36d8ca26658d84bb87189ccbbcca327254c43992d"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.916173 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" event={"ID":"87d8b027-8111-447a-b6b2-da78394a12ef","Type":"ContainerStarted","Data":"4afa0be3c1550d921a1ec2e1676543fc8d5f2b1859a8458c14a2abbe141c8cda"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.919123 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" event={"ID":"8a68e96d-d547-4060-8ab8-c693324a4423","Type":"ContainerStarted","Data":"a886012b36a37b10f41a89e3bba4b398fcbb3fdddfbce03fa2123cc602c5f8b3"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.921265 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gprnh" event={"ID":"236d3ca8-1434-428c-a244-d6ed1ca8a299","Type":"ContainerStarted","Data":"1112b8a5181da0e774defc3b3ad26c32623f765fe5a31669d7ea544edea54299"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.921423 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.921491 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gprnh" event={"ID":"236d3ca8-1434-428c-a244-d6ed1ca8a299","Type":"ContainerStarted","Data":"64c1cae4e5ecb2dccf7da04cb3f87b801097e33898694a693c5a894444126860"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.923222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" event={"ID":"0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9","Type":"ContainerStarted","Data":"79e90e106d4fe57ca98e2e7565fcb0b72b1ee5b5a5e229e05d18277247bb000a"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.924862 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" event={"ID":"5a7e8214-29ef-48d7-aea5-bdca17750404","Type":"ContainerStarted","Data":"98bfa8f88c969767b4242de0d2efaee0867563691fafbd2cdb4061e7151236d6"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.925033 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.927462 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" event={"ID":"ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c","Type":"ContainerStarted","Data":"4ab7bfbf5b27de40fb7459b71399efe3421863ecfbd675c56696ea8508066a4b"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.927630 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.927823 4772 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7g4kc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.927902 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" podUID="5a7e8214-29ef-48d7-aea5-bdca17750404" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.928998 4772 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7p9mp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.929034 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" podUID="ab9dc342-40ca-4ac8-82eb-2c2f5c0b294c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.930291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" event={"ID":"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2","Type":"ContainerStarted","Data":"ce122031675c26b37a165ab0320e0e3cb044f3abebce17e7b6c67d82cb93117c"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.930354 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" event={"ID":"e3ff9cfc-f5f2-4e07-acbf-88a5a3e343a2","Type":"ContainerStarted","Data":"cfa48cde28a96ed9e3e83e0e943385a9addb6f3c703550e1a8ffb97eadca1338"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.930648 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.930965 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" podStartSLOduration=136.930945481 podStartE2EDuration="2m16.930945481s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.925828258 +0000 UTC m=+157.832841089" watchObservedRunningTime="2025-09-30 17:04:16.930945481 +0000 UTC m=+157.837958312" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.935311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8k777" event={"ID":"aa56ffe7-d880-43d4-b0bb-135e1016d110","Type":"ContainerStarted","Data":"50ba9ff54d7bbf0f9178b384a6ac3ed1165df48af3806c14c281565da5936ac9"} Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.936001 4772 patch_prober.go:28] interesting 
pod/controller-manager-879f6c89f-ghsks container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.936149 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.947897 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4x24b" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.951884 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gprnh" podStartSLOduration=7.951866876 podStartE2EDuration="7.951866876s" podCreationTimestamp="2025-09-30 17:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.950155692 +0000 UTC m=+157.857168543" watchObservedRunningTime="2025-09-30 17:04:16.951866876 +0000 UTC m=+157.858879707" Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.966212 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.970632 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.470578104 +0000 UTC m=+158.377590935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.971089 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:16 crc kubenswrapper[4772]: E0930 17:04:16.985322 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.485293227 +0000 UTC m=+158.392306048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:16 crc kubenswrapper[4772]: I0930 17:04:16.989268 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9bdx2" podStartSLOduration=136.9892488 podStartE2EDuration="2m16.9892488s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:16.987449793 +0000 UTC m=+157.894462624" watchObservedRunningTime="2025-09-30 17:04:16.9892488 +0000 UTC m=+157.896261621" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.062756 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" podStartSLOduration=137.062722434 podStartE2EDuration="2m17.062722434s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.06255675 +0000 UTC m=+157.969569581" watchObservedRunningTime="2025-09-30 17:04:17.062722434 +0000 UTC m=+157.969735265" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.080579 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" podStartSLOduration=137.080557529 podStartE2EDuration="2m17.080557529s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.029533059 +0000 UTC m=+157.936545890" watchObservedRunningTime="2025-09-30 17:04:17.080557529 +0000 UTC m=+157.987570360" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.082795 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-grlbs" podStartSLOduration=136.082785477 podStartE2EDuration="2m16.082785477s" podCreationTimestamp="2025-09-30 17:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.077556671 +0000 UTC m=+157.984569502" watchObservedRunningTime="2025-09-30 17:04:17.082785477 +0000 UTC m=+157.989798338" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.089038 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.089263 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:04:17.589242565 +0000 UTC m=+158.496255396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.089432 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.089753 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.589745588 +0000 UTC m=+158.496758419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.101200 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdmvr" podStartSLOduration=137.101177456 podStartE2EDuration="2m17.101177456s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.09903605 +0000 UTC m=+158.006048891" watchObservedRunningTime="2025-09-30 17:04:17.101177456 +0000 UTC m=+158.008190287" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.162214 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" podStartSLOduration=137.162190155 podStartE2EDuration="2m17.162190155s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.158337465 +0000 UTC m=+158.065350286" watchObservedRunningTime="2025-09-30 17:04:17.162190155 +0000 UTC m=+158.069202986" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.195220 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.195511 4772 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.695495543 +0000 UTC m=+158.602508364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.207297 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" podStartSLOduration=137.20727891 podStartE2EDuration="2m17.20727891s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:17.204807606 +0000 UTC m=+158.111820427" watchObservedRunningTime="2025-09-30 17:04:17.20727891 +0000 UTC m=+158.114291731" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.296469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.296964 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.796939816 +0000 UTC m=+158.703952647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.340916 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:17 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:17 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:17 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.341004 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.397916 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.398077 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.898037959 +0000 UTC m=+158.805050790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.398179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.398482 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:17.898470391 +0000 UTC m=+158.805483222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.499937 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.500151 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.000115079 +0000 UTC m=+158.907127910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.500209 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.500556 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.00053978 +0000 UTC m=+158.907552691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.601977 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.602184 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.102156177 +0000 UTC m=+159.009169008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.602332 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.602600 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.102587898 +0000 UTC m=+159.009600729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.703353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.703560 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.203526938 +0000 UTC m=+159.110539769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.703999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.704407 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.20439632 +0000 UTC m=+159.111409221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.805116 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.805343 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.305304129 +0000 UTC m=+159.212316960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.805601 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.805914 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.305901015 +0000 UTC m=+159.212913846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.816189 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8cxm" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.906748 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:17 crc kubenswrapper[4772]: E0930 17:04:17.907232 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.407209534 +0000 UTC m=+159.314222375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.964800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8k777" event={"ID":"aa56ffe7-d880-43d4-b0bb-135e1016d110","Type":"ContainerStarted","Data":"72e829c8694212f383bab5d94857a99b7058f0c42b1b46dc087d9dcdce3ae2f5"} Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.966730 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tbscw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.966787 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.987125 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:04:17 crc kubenswrapper[4772]: I0930 17:04:17.987824 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7g4kc" Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.009253 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.016072 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.516038139 +0000 UTC m=+159.423050970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.053702 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7p9mp" Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.112628 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.112909 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.612893892 +0000 UTC m=+159.519906723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.213831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.214830 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.714811847 +0000 UTC m=+159.621824678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.318656 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.319030 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.819013662 +0000 UTC m=+159.726026493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.345302 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:18 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:18 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:18 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.345359 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.420347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.420801 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:18.920780943 +0000 UTC m=+159.827793844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.521737 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.521915 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.021883997 +0000 UTC m=+159.928896828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.522131 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.522454 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.022440371 +0000 UTC m=+159.929453202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.529731 4772 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jnm2b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]log ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]etcd ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/max-in-flight-filter ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 17:04:18 crc kubenswrapper[4772]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 17:04:18 crc kubenswrapper[4772]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/openshift.io-startinformers ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 17:04:18 crc kubenswrapper[4772]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 17:04:18 crc kubenswrapper[4772]: livez check failed Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.529804 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" podUID="056fc2a2-f5db-4887-bada-a7215edd00d4" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.623758 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.623929 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.123899404 +0000 UTC m=+160.030912235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.624156 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.624510 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.124496699 +0000 UTC m=+160.031509530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.725532 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.725750 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.225719026 +0000 UTC m=+160.132731857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.725881 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.726189 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.226177038 +0000 UTC m=+160.133189859 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.827169 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.827395 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.327362814 +0000 UTC m=+160.234375635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.827530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.827933 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.327915748 +0000 UTC m=+160.234928579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.928592 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.928787 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.428760906 +0000 UTC m=+160.335773737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.928910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:18 crc kubenswrapper[4772]: E0930 17:04:18.929247 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.429239918 +0000 UTC m=+160.336252749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.973957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8k777" event={"ID":"aa56ffe7-d880-43d4-b0bb-135e1016d110","Type":"ContainerStarted","Data":"65be69b6ba0de6ba21f63894c4cba11cba22711863f1d010216a450a1e04c441"} Sep 30 17:04:18 crc kubenswrapper[4772]: I0930 17:04:18.974004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8k777" event={"ID":"aa56ffe7-d880-43d4-b0bb-135e1016d110","Type":"ContainerStarted","Data":"63fd40fdafe76a3417040f494841681aa2b7cd3e1630ad0fe4b6914d2d44b43b"} Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.016641 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-8k777" podStartSLOduration=10.016619744 podStartE2EDuration="10.016619744s" podCreationTimestamp="2025-09-30 17:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:19.010880535 +0000 UTC m=+159.917893366" watchObservedRunningTime="2025-09-30 17:04:19.016619744 +0000 UTC m=+159.923632575" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.029607 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.030610 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.530589388 +0000 UTC m=+160.437602229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.132075 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.132522 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.632504233 +0000 UTC m=+160.539517064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.233259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.233480 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.733449843 +0000 UTC m=+160.640462674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.233547 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.233955 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.733938476 +0000 UTC m=+160.640951307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.303642 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.304883 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.310760 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.320817 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.334890 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.335130 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.335253 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.335302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfrk\" (UniqueName: \"kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.335429 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.835411699 +0000 UTC m=+160.742424530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.344210 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:19 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:19 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:19 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.344279 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.369311 4772 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.406488 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.407194 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.410637 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.410771 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.425550 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437113 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437175 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfrk\" (UniqueName: \"kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437279 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437338 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 
17:04:19.437761 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:19.937744015 +0000 UTC m=+160.844756846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.437886 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.476644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfrk\" (UniqueName: \"kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk\") pod \"community-operators-ddnkh\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.508313 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.510015 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.521018 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.537133 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539201 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.539398 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.039366133 +0000 UTC m=+160.946378964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539584 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539661 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539713 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539782 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwrh\" (UniqueName: \"kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.539804 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.540010 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.039994909 +0000 UTC m=+160.947007740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.561897 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.618304 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641030 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.641217 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.141186135 +0000 UTC m=+161.048198966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641415 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641445 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641490 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwrh\" (UniqueName: \"kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.641863 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.641921 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.141901384 +0000 UTC m=+161.048914215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.642157 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.673770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwrh\" (UniqueName: \"kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh\") pod \"certified-operators-6j79n\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.705151 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.710659 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.713607 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.721362 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.742697 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.742850 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smml7\" (UniqueName: \"kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.742920 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.742969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.743076 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.243047539 +0000 UTC m=+161.150060370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.824338 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.846917 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.846961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smml7\" (UniqueName: \"kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.846989 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.847027 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.847946 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.848470 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.348451935 +0000 UTC m=+161.255464766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.848551 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.891146 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smml7\" (UniqueName: \"kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7\") pod \"community-operators-gn4wh\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.912894 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.913875 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.943531 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.954459 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.954655 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.454634801 +0000 UTC m=+161.361647632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.958824 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9tw\" (UniqueName: \"kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.958896 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.959014 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:19 crc kubenswrapper[4772]: I0930 17:04:19.959049 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:19 crc kubenswrapper[4772]: E0930 17:04:19.959749 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.459735454 +0000 UTC m=+161.366748285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2gtql" (UID: "38933d3b-1f86-415d-923c-c8366e93021f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.005738 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.062234 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.062546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9tw\" (UniqueName: \"kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.062581 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.062668 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.063135 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: E0930 17:04:20.063208 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:04:20.563192669 +0000 UTC m=+161.470205500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.064445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.094281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9tw\" (UniqueName: \"kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw\") pod \"certified-operators-4xtss\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.089193 4772 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T17:04:19.369334213Z","Handler":null,"Name":""} Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.117897 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.138349 4772 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.138396 4772 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.160183 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.167150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.185991 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.186034 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.266924 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.286712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2gtql\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.313808 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.343270 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:20 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:20 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:20 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.343334 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:20 crc kubenswrapper[4772]: W0930 17:04:20.360257 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f701b8e_c15e_48f0_a732_fba005c98ff7.slice/crio-4dcaf2cad859efc7378f7394a4212a3d7eda25792556ab835d44e57cba4b7a1f WatchSource:0}: Error finding container 4dcaf2cad859efc7378f7394a4212a3d7eda25792556ab835d44e57cba4b7a1f: Status 404 returned error can't find the container with id 4dcaf2cad859efc7378f7394a4212a3d7eda25792556ab835d44e57cba4b7a1f Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.369768 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.374324 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.450484 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.532615 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.651611 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.758600 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:04:20 crc kubenswrapper[4772]: W0930 17:04:20.779079 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38933d3b_1f86_415d_923c_c8366e93021f.slice/crio-3a485eea8d8abc6ee851fd0104ed135e85d3e5e18eed7aedb14f57c9ffadfc53 WatchSource:0}: Error finding container 3a485eea8d8abc6ee851fd0104ed135e85d3e5e18eed7aedb14f57c9ffadfc53: Status 404 returned error can't find the container with id 3a485eea8d8abc6ee851fd0104ed135e85d3e5e18eed7aedb14f57c9ffadfc53 Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.988906 4772 generic.go:334] "Generic (PLEG): container finished" podID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerID="3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607" exitCode=0 Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.989302 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerDied","Data":"3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607"} Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.989337 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerStarted","Data":"b30b3f83b12646edef5a8b945323789934101d771152dfceb68f3aa267b696ba"} Sep 30 17:04:20 crc kubenswrapper[4772]: I0930 17:04:20.991979 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.001971 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerID="bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545" exitCode=0 Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.002133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerDied","Data":"bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.002191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerStarted","Data":"e1fe596b9108c853f455bc42351d7ff0403aeb050b1b5b603e200154658797a9"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.011165 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerID="12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5" exitCode=0 Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.011363 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerDied","Data":"12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.011423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerStarted","Data":"4dcaf2cad859efc7378f7394a4212a3d7eda25792556ab835d44e57cba4b7a1f"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.013529 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" event={"ID":"38933d3b-1f86-415d-923c-c8366e93021f","Type":"ContainerStarted","Data":"3a485eea8d8abc6ee851fd0104ed135e85d3e5e18eed7aedb14f57c9ffadfc53"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.025356 4772 generic.go:334] "Generic (PLEG): container finished" podID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerID="0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d" exitCode=0 Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.025580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerDied","Data":"0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.025624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerStarted","Data":"5d78b606977d1fef452b662772eed626924bea7ae3f9431be5e18b3180c8e77e"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.043716 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff14a12a-2e3a-47eb-983c-18e98788e4a6","Type":"ContainerStarted","Data":"613ecbcd9bc14cea24e755ca3286ea05b7b35c03c1269ad3fdf5a91f7414293b"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.043767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff14a12a-2e3a-47eb-983c-18e98788e4a6","Type":"ContainerStarted","Data":"744cb78373542348fc55acb3e1dfbaa621227cb2f7936a02d3f3f18e3275867b"} Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.101478 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.101460637 podStartE2EDuration="2.101460637s" podCreationTimestamp="2025-09-30 17:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:21.076427975 +0000 UTC m=+161.983440806" watchObservedRunningTime="2025-09-30 17:04:21.101460637 +0000 UTC m=+162.008473468" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.338179 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Sep 30 17:04:21 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:21 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:21 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.338310 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.380131 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.380333 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.381903 4772 patch_prober.go:28] interesting pod/console-f9d7485db-tm4sk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.382034 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tm4sk" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.461824 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.467784 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jnm2b" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.505863 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.507440 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.509256 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.520509 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.587214 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx42k\" (UniqueName: \"kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.587288 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.587387 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.590326 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-ppvk7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.590366 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ppvk7" podUID="2bda8593-604a-4bf9-9cd1-0d56310dd0f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.591503 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-ppvk7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.591538 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ppvk7" podUID="2bda8593-604a-4bf9-9cd1-0d56310dd0f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.689159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx42k\" (UniqueName: \"kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.689237 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.689295 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.689918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.690179 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.725177 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx42k\" (UniqueName: \"kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k\") pod \"redhat-marketplace-znf69\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.825164 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.906294 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.919322 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.920522 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.936271 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.994222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnq9p\" (UniqueName: \"kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.994284 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:21 crc kubenswrapper[4772]: I0930 17:04:21.994339 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.071982 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff14a12a-2e3a-47eb-983c-18e98788e4a6" containerID="613ecbcd9bc14cea24e755ca3286ea05b7b35c03c1269ad3fdf5a91f7414293b" exitCode=0 Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.072880 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff14a12a-2e3a-47eb-983c-18e98788e4a6","Type":"ContainerDied","Data":"613ecbcd9bc14cea24e755ca3286ea05b7b35c03c1269ad3fdf5a91f7414293b"} Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.095727 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.095807 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnq9p\" (UniqueName: \"kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.095834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.096404 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities\") pod \"redhat-marketplace-pm9fz\" (UID: 
\"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.098020 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.131733 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.136367 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" event={"ID":"38933d3b-1f86-415d-923c-c8366e93021f","Type":"ContainerStarted","Data":"c92641fc3906b5aa25a2e94584b352662ea536f555e071692f4ba237ba88a30a"} Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.137082 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnq9p\" (UniqueName: \"kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p\") pod \"redhat-marketplace-pm9fz\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.137445 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.167960 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" podStartSLOduration=142.167940869 podStartE2EDuration="2m22.167940869s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:22.167613051 +0000 UTC m=+163.074625882" watchObservedRunningTime="2025-09-30 17:04:22.167940869 +0000 UTC m=+163.074953700" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.253989 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.334675 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.338582 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:22 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:22 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:22 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.338639 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.508934 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.526022 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.530550 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.536118 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.539459 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.603753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wng64\" (UniqueName: \"kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.603835 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.605690 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.707285 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content\") pod \"redhat-operators-ntkh6\" (UID: 
\"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.707343 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wng64\" (UniqueName: \"kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.707371 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.707825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.708040 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.729443 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wng64\" (UniqueName: \"kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64\") pod \"redhat-operators-ntkh6\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.822269 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.893397 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.901694 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.903510 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.909963 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.910006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.910029 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfmng\" (UniqueName: \"kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:22 crc kubenswrapper[4772]: I0930 17:04:22.918433 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.014257 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.014316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.014345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmng\" (UniqueName: \"kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.015469 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.016598 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.039079 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zfmng\" (UniqueName: \"kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng\") pod \"redhat-operators-sln52\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.146914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerStarted","Data":"8b51889a237490eaa75e7bd403ede266dfb23f8c3483e3555d9cba215c78df32"} Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.146972 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerStarted","Data":"19f860a20674e3c70af12ab3d84768cb0cb47a5d773c5afe05d2c84f3a3d2615"} Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.149110 4772 generic.go:334] "Generic (PLEG): container finished" podID="8a68e96d-d547-4060-8ab8-c693324a4423" containerID="a886012b36a37b10f41a89e3bba4b398fcbb3fdddfbce03fa2123cc602c5f8b3" exitCode=0 Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.149195 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" event={"ID":"8a68e96d-d547-4060-8ab8-c693324a4423","Type":"ContainerDied","Data":"a886012b36a37b10f41a89e3bba4b398fcbb3fdddfbce03fa2123cc602c5f8b3"} Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.157543 4772 generic.go:334] "Generic (PLEG): container finished" podID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerID="89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15" exitCode=0 Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.158036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerDied","Data":"89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15"} Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.158166 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerStarted","Data":"c844f25dd285073b3d583e9c4b1e2a75f812581033b8fe64f5171c6bd4ed8c5f"} Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.254130 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.318688 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.324995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2541dd-c77d-4bc5-9771-6ac741731464-metrics-certs\") pod \"network-metrics-daemon-wlgc4\" (UID: \"0f2541dd-c77d-4bc5-9771-6ac741731464\") " pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.346524 4772 patch_prober.go:28] interesting pod/router-default-5444994796-n4pdp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:04:23 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Sep 30 17:04:23 crc kubenswrapper[4772]: [+]process-running ok Sep 30 17:04:23 crc kubenswrapper[4772]: healthz check failed Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.346595 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4pdp" podUID="f8e81373-125e-4a51-875e-455dd284fa9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.414662 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:04:23 crc kubenswrapper[4772]: W0930 17:04:23.436084 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee62041_3f62_4daa_b7cc_9b8ec568bc61.slice/crio-465d828d30ab266f47fe0a548402fbc4200fc0b6ad158c52001cc4fcc6217d51 WatchSource:0}: Error finding container 465d828d30ab266f47fe0a548402fbc4200fc0b6ad158c52001cc4fcc6217d51: Status 404 returned error can't find the container with id 465d828d30ab266f47fe0a548402fbc4200fc0b6ad158c52001cc4fcc6217d51 Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.450462 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.557159 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlgc4" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.622787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir\") pod \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.622864 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access\") pod \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\" (UID: \"ff14a12a-2e3a-47eb-983c-18e98788e4a6\") " Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.623025 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff14a12a-2e3a-47eb-983c-18e98788e4a6" (UID: "ff14a12a-2e3a-47eb-983c-18e98788e4a6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.623227 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.630252 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff14a12a-2e3a-47eb-983c-18e98788e4a6" (UID: "ff14a12a-2e3a-47eb-983c-18e98788e4a6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.724165 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff14a12a-2e3a-47eb-983c-18e98788e4a6-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.832211 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:04:23 crc kubenswrapper[4772]: W0930 17:04:23.853704 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b62f9b_60d9_4421_860e_72eb21c7aab4.slice/crio-b268ce6886581b53badd56cb9ef956f6c800411a6059c79a0c1904801e735204 WatchSource:0}: Error finding container b268ce6886581b53badd56cb9ef956f6c800411a6059c79a0c1904801e735204: Status 404 returned error can't find the container with id b268ce6886581b53badd56cb9ef956f6c800411a6059c79a0c1904801e735204 Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.894759 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:04:23 crc kubenswrapper[4772]: E0930 17:04:23.895292 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff14a12a-2e3a-47eb-983c-18e98788e4a6" containerName="pruner" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.895316 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff14a12a-2e3a-47eb-983c-18e98788e4a6" containerName="pruner" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.895514 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff14a12a-2e3a-47eb-983c-18e98788e4a6" containerName="pruner" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.896003 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.900595 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.900886 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.911999 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:04:23 crc kubenswrapper[4772]: I0930 17:04:23.928786 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlgc4"] Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.027233 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.027580 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.128770 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.128917 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.129093 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.150637 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.191868 4772 generic.go:334] "Generic (PLEG): container finished" podID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerID="8b51889a237490eaa75e7bd403ede266dfb23f8c3483e3555d9cba215c78df32" exitCode=0 Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.191968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerDied","Data":"8b51889a237490eaa75e7bd403ede266dfb23f8c3483e3555d9cba215c78df32"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.196886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerStarted","Data":"b268ce6886581b53badd56cb9ef956f6c800411a6059c79a0c1904801e735204"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.200410 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerID="8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0" exitCode=0 Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.200516 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerDied","Data":"8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.200552 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerStarted","Data":"465d828d30ab266f47fe0a548402fbc4200fc0b6ad158c52001cc4fcc6217d51"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.213190 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.213183 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff14a12a-2e3a-47eb-983c-18e98788e4a6","Type":"ContainerDied","Data":"744cb78373542348fc55acb3e1dfbaa621227cb2f7936a02d3f3f18e3275867b"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.213306 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="744cb78373542348fc55acb3e1dfbaa621227cb2f7936a02d3f3f18e3275867b" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.223401 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" event={"ID":"0f2541dd-c77d-4bc5-9771-6ac741731464","Type":"ContainerStarted","Data":"c2caba9c5c6cb84f3d2afc06eab918fa15916c8a0740b3715803efad41b05a9c"} Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.291680 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.342248 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.349813 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-n4pdp" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.484367 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.645687 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume\") pod \"8a68e96d-d547-4060-8ab8-c693324a4423\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.645779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz9s7\" (UniqueName: \"kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7\") pod \"8a68e96d-d547-4060-8ab8-c693324a4423\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.645822 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume\") pod \"8a68e96d-d547-4060-8ab8-c693324a4423\" (UID: \"8a68e96d-d547-4060-8ab8-c693324a4423\") " Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.646884 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a68e96d-d547-4060-8ab8-c693324a4423" (UID: "8a68e96d-d547-4060-8ab8-c693324a4423"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.650877 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7" (OuterVolumeSpecName: "kube-api-access-xz9s7") pod "8a68e96d-d547-4060-8ab8-c693324a4423" (UID: "8a68e96d-d547-4060-8ab8-c693324a4423"). InnerVolumeSpecName "kube-api-access-xz9s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.651970 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a68e96d-d547-4060-8ab8-c693324a4423" (UID: "8a68e96d-d547-4060-8ab8-c693324a4423"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.747968 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a68e96d-d547-4060-8ab8-c693324a4423-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.748006 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a68e96d-d547-4060-8ab8-c693324a4423-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.748017 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz9s7\" (UniqueName: \"kubernetes.io/projected/8a68e96d-d547-4060-8ab8-c693324a4423-kube-api-access-xz9s7\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:24 crc kubenswrapper[4772]: I0930 17:04:24.783813 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.262988 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" event={"ID":"0f2541dd-c77d-4bc5-9771-6ac741731464","Type":"ContainerStarted","Data":"193f0f21e278ae0ec62515dfc424afdc0ea5bb90f454a0dccc432abc85c7c66e"} Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.269301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0","Type":"ContainerStarted","Data":"bde7c0f8d320fa1d8dcfb5747c0db73d033e7cd2a3f4ec55ff53df73b52a060b"} Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.281240 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" event={"ID":"8a68e96d-d547-4060-8ab8-c693324a4423","Type":"ContainerDied","Data":"2cef19e4762b79b3521ec8795684147844201729bbde62514a1e59b2e40672c0"} Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.281296 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cef19e4762b79b3521ec8795684147844201729bbde62514a1e59b2e40672c0" Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.281401 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc" Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.291240 4772 generic.go:334] "Generic (PLEG): container finished" podID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerID="34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92" exitCode=0 Sep 30 17:04:25 crc kubenswrapper[4772]: I0930 17:04:25.292388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerDied","Data":"34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92"} Sep 30 17:04:26 crc kubenswrapper[4772]: I0930 17:04:26.328774 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlgc4" event={"ID":"0f2541dd-c77d-4bc5-9771-6ac741731464","Type":"ContainerStarted","Data":"1c3a51222bdcaa74f7954908176ab68389e6ea6d1b0f8b57c6125ac2358b755b"} Sep 30 17:04:26 crc kubenswrapper[4772]: I0930 17:04:26.336029 4772 generic.go:334] "Generic (PLEG): container finished" podID="6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" containerID="3c7c1e95f0af9fdfe9b95cc3e72af6479b40080991313c82f064bb05e0725115" exitCode=0 Sep 30 17:04:26 crc kubenswrapper[4772]: I0930 17:04:26.336107 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0","Type":"ContainerDied","Data":"3c7c1e95f0af9fdfe9b95cc3e72af6479b40080991313c82f064bb05e0725115"} Sep 30 17:04:26 crc kubenswrapper[4772]: I0930 17:04:26.356152 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wlgc4" podStartSLOduration=146.356121195 podStartE2EDuration="2m26.356121195s" podCreationTimestamp="2025-09-30 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:04:26.349270657 +0000 UTC m=+167.256283488" watchObservedRunningTime="2025-09-30 17:04:26.356121195 +0000 UTC m=+167.263134026" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.549791 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gprnh" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.712577 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.744934 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir\") pod \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.750857 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" (UID: "6eaebe8c-bc5b-49e6-8505-1cc00258b9c0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.751294 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access\") pod \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\" (UID: \"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0\") " Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.758278 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.789270 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" (UID: "6eaebe8c-bc5b-49e6-8505-1cc00258b9c0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:04:27 crc kubenswrapper[4772]: I0930 17:04:27.860269 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaebe8c-bc5b-49e6-8505-1cc00258b9c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:04:28 crc kubenswrapper[4772]: I0930 17:04:28.387834 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6eaebe8c-bc5b-49e6-8505-1cc00258b9c0","Type":"ContainerDied","Data":"bde7c0f8d320fa1d8dcfb5747c0db73d033e7cd2a3f4ec55ff53df73b52a060b"} Sep 30 17:04:28 crc kubenswrapper[4772]: I0930 17:04:28.387882 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde7c0f8d320fa1d8dcfb5747c0db73d033e7cd2a3f4ec55ff53df73b52a060b" Sep 30 17:04:28 crc kubenswrapper[4772]: I0930 17:04:28.387987 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:04:31 crc kubenswrapper[4772]: I0930 17:04:31.394443 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:31 crc kubenswrapper[4772]: I0930 17:04:31.399411 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:04:31 crc kubenswrapper[4772]: I0930 17:04:31.596691 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ppvk7" Sep 30 17:04:38 crc kubenswrapper[4772]: I0930 17:04:38.655595 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:04:38 crc kubenswrapper[4772]: I0930 17:04:38.655883 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:04:40 crc kubenswrapper[4772]: I0930 17:04:40.465268 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:04:47 crc kubenswrapper[4772]: I0930 17:04:47.979597 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:04:52 crc kubenswrapper[4772]: I0930 17:04:52.504228 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mj4hj" Sep 30 17:04:56 crc kubenswrapper[4772]: E0930 17:04:56.305677 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:04:56 crc kubenswrapper[4772]: E0930 17:04:56.306783 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sx42k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-znf69_openshift-marketplace(cc49ca3c-00ac-47d2-abd4-74436beb8c45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:04:56 crc kubenswrapper[4772]: E0930 17:04:56.307947 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-znf69" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.683480 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-znf69" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.781971 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.782176 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mfrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ddnkh_openshift-marketplace(35d2ea14-6885-4373-b795-4e4714b4a2ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.783378 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ddnkh" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.801672 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.801808 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnq9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pm9fz_openshift-marketplace(ca089976-6fee-4d9a-9e3e-48a61b6de5f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.803194 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pm9fz" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.819500 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.819675 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smml7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gn4wh_openshift-marketplace(a1577bbb-ede6-49be-b05e-09e35194cde6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.821029 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gn4wh" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.837939 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.838084 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hz9tw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4xtss_openshift-marketplace(34d82af0-e8df-49bb-bc8b-372ed51d7d53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:04:57 crc kubenswrapper[4772]: E0930 17:04:57.839291 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4xtss" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.566962 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerID="107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f" exitCode=0 Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.567087 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerDied","Data":"107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f"} Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.571871 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerID="3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435" exitCode=0 Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.571967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerDied","Data":"3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435"} Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.574029 4772 generic.go:334] "Generic (PLEG): container finished" podID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerID="dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee" exitCode=0 Sep 30 17:04:58 crc kubenswrapper[4772]: I0930 17:04:58.574095 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" 
event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerDied","Data":"dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee"} Sep 30 17:04:58 crc kubenswrapper[4772]: E0930 17:04:58.577897 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4xtss" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" Sep 30 17:04:58 crc kubenswrapper[4772]: E0930 17:04:58.577974 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pm9fz" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" Sep 30 17:04:58 crc kubenswrapper[4772]: E0930 17:04:58.578025 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ddnkh" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" Sep 30 17:04:58 crc kubenswrapper[4772]: E0930 17:04:58.578346 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gn4wh" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.584586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerStarted","Data":"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710"} Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.589139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerStarted","Data":"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8"} Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.606579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerStarted","Data":"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137"} Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.614146 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntkh6" podStartSLOduration=2.862663536 podStartE2EDuration="37.614124582s" podCreationTimestamp="2025-09-30 17:04:22 +0000 UTC" firstStartedPulling="2025-09-30 17:04:24.202648446 +0000 UTC m=+165.109661277" lastFinishedPulling="2025-09-30 17:04:58.954109492 +0000 UTC m=+199.861122323" observedRunningTime="2025-09-30 17:04:59.61018938 +0000 UTC m=+200.517202231" watchObservedRunningTime="2025-09-30 17:04:59.614124582 +0000 UTC m=+200.521137423" Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.644167 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6j79n" podStartSLOduration=2.681939926 
podStartE2EDuration="40.644133727s" podCreationTimestamp="2025-09-30 17:04:19 +0000 UTC" firstStartedPulling="2025-09-30 17:04:21.014662236 +0000 UTC m=+161.921675057" lastFinishedPulling="2025-09-30 17:04:58.976856027 +0000 UTC m=+199.883868858" observedRunningTime="2025-09-30 17:04:59.642388931 +0000 UTC m=+200.549401762" watchObservedRunningTime="2025-09-30 17:04:59.644133727 +0000 UTC m=+200.551146578" Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.663462 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sln52" podStartSLOduration=3.889614604 podStartE2EDuration="37.663445522s" podCreationTimestamp="2025-09-30 17:04:22 +0000 UTC" firstStartedPulling="2025-09-30 17:04:25.29338531 +0000 UTC m=+166.200398141" lastFinishedPulling="2025-09-30 17:04:59.067216228 +0000 UTC m=+199.974229059" observedRunningTime="2025-09-30 17:04:59.66032835 +0000 UTC m=+200.567341181" watchObservedRunningTime="2025-09-30 17:04:59.663445522 +0000 UTC m=+200.570458353" Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.825774 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:04:59 crc kubenswrapper[4772]: I0930 17:04:59.825854 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:05:01 crc kubenswrapper[4772]: I0930 17:05:01.039556 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6j79n" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" probeResult="failure" output=< Sep 30 17:05:01 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 17:05:01 crc kubenswrapper[4772]: > Sep 30 17:05:02 crc kubenswrapper[4772]: I0930 17:05:02.894288 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:05:02 crc kubenswrapper[4772]: I0930 17:05:02.894376 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:05:03 crc kubenswrapper[4772]: I0930 17:05:03.255147 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:03 crc kubenswrapper[4772]: I0930 17:05:03.255217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:03 crc kubenswrapper[4772]: I0930 17:05:03.934859 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ntkh6" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="registry-server" probeResult="failure" output=< Sep 30 17:05:03 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 17:05:03 crc kubenswrapper[4772]: > Sep 30 17:05:04 crc kubenswrapper[4772]: I0930 17:05:04.293016 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sln52" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="registry-server" probeResult="failure" output=< Sep 30 17:05:04 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 17:05:04 crc kubenswrapper[4772]: > Sep 30 17:05:08 crc kubenswrapper[4772]: I0930 17:05:08.655246 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:05:08 crc kubenswrapper[4772]: I0930 17:05:08.655360 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:05:08 crc kubenswrapper[4772]: I0930 17:05:08.655454 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:05:08 crc kubenswrapper[4772]: I0930 17:05:08.656623 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:05:08 crc kubenswrapper[4772]: I0930 17:05:08.657472 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816" gracePeriod=600 Sep 30 17:05:09 crc kubenswrapper[4772]: I0930 17:05:09.684742 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816" exitCode=0 Sep 30 17:05:09 crc kubenswrapper[4772]: I0930 17:05:09.684852 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816"} Sep 30 17:05:09 crc kubenswrapper[4772]: I0930 17:05:09.685523 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6"} Sep 30 17:05:09 crc kubenswrapper[4772]: I0930 17:05:09.937352 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:05:09 crc kubenswrapper[4772]: I0930 17:05:09.995144 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:05:11 crc kubenswrapper[4772]: I0930 17:05:11.701046 4772 generic.go:334] "Generic (PLEG): container finished" podID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerID="1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40" exitCode=0 Sep 30 17:05:11 crc kubenswrapper[4772]: I0930 17:05:11.701127 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerDied","Data":"1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40"} Sep 30 17:05:11 
crc kubenswrapper[4772]: I0930 17:05:11.704543 4772 generic.go:334] "Generic (PLEG): container finished" podID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerID="6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1" exitCode=0 Sep 30 17:05:11 crc kubenswrapper[4772]: I0930 17:05:11.704616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerDied","Data":"6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1"} Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.711869 4772 generic.go:334] "Generic (PLEG): container finished" podID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerID="dcb43f4b2c0fa1bd7d6ca67ab44973202cd880bb26b302ce399aa4b38139d7e5" exitCode=0 Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.711959 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerDied","Data":"dcb43f4b2c0fa1bd7d6ca67ab44973202cd880bb26b302ce399aa4b38139d7e5"} Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.714779 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerID="a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240" exitCode=0 Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.714840 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerDied","Data":"a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240"} Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.717662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerStarted","Data":"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506"} Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.723304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerStarted","Data":"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c"} Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.761641 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ddnkh" podStartSLOduration=2.649073158 podStartE2EDuration="53.761620384s" podCreationTimestamp="2025-09-30 17:04:19 +0000 UTC" firstStartedPulling="2025-09-30 17:04:20.990538487 +0000 UTC m=+161.897551328" lastFinishedPulling="2025-09-30 17:05:12.103085723 +0000 UTC m=+213.010098554" observedRunningTime="2025-09-30 17:05:12.758943454 +0000 UTC m=+213.665956285" watchObservedRunningTime="2025-09-30 17:05:12.761620384 +0000 UTC m=+213.668633215" Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.800392 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-znf69" podStartSLOduration=2.752797256 podStartE2EDuration="51.800369077s" podCreationTimestamp="2025-09-30 17:04:21 +0000 UTC" firstStartedPulling="2025-09-30 17:04:23.164339376 +0000 UTC m=+164.071352217" lastFinishedPulling="2025-09-30 17:05:12.211911207 +0000 UTC m=+213.118924038" observedRunningTime="2025-09-30 17:05:12.798942499 +0000 UTC m=+213.705955330" 
watchObservedRunningTime="2025-09-30 17:05:12.800369077 +0000 UTC m=+213.707381908" Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.940669 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:05:12 crc kubenswrapper[4772]: I0930 17:05:12.980017 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.326523 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.372678 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.731269 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerStarted","Data":"21b51b2bcd0279c98a79bc482793b829c78919539c4dfdd07df8ff344a13f444"} Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.734624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerStarted","Data":"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365"} Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.816730 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pm9fz" podStartSLOduration=3.883671546 podStartE2EDuration="52.8167049s" podCreationTimestamp="2025-09-30 17:04:21 +0000 UTC" firstStartedPulling="2025-09-30 17:04:24.193741844 +0000 UTC m=+165.100754675" lastFinishedPulling="2025-09-30 17:05:13.126775208 +0000 UTC m=+214.033788029" observedRunningTime="2025-09-30 17:05:13.812447608 +0000 UTC m=+214.719460439" watchObservedRunningTime="2025-09-30 17:05:13.8167049 +0000 UTC m=+214.723717741" Sep 30 17:05:13 crc kubenswrapper[4772]: I0930 17:05:13.843821 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gn4wh" podStartSLOduration=2.644884879 podStartE2EDuration="54.843799458s" podCreationTimestamp="2025-09-30 17:04:19 +0000 UTC" firstStartedPulling="2025-09-30 17:04:21.003641219 +0000 UTC m=+161.910654060" lastFinishedPulling="2025-09-30 17:05:13.202555808 +0000 UTC m=+214.109568639" observedRunningTime="2025-09-30 17:05:13.842077033 +0000 UTC m=+214.749089864" watchObservedRunningTime="2025-09-30 17:05:13.843799458 +0000 UTC m=+214.750812299" Sep 30 17:05:14 crc kubenswrapper[4772]: I0930 17:05:14.742873 4772 generic.go:334] "Generic (PLEG): container finished" podID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerID="d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606" exitCode=0 Sep 30 17:05:14 crc kubenswrapper[4772]: I0930 17:05:14.742991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerDied","Data":"d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606"} Sep 30 17:05:15 crc kubenswrapper[4772]: I0930 17:05:15.749921 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" 
event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerStarted","Data":"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c"} Sep 30 17:05:15 crc kubenswrapper[4772]: I0930 17:05:15.767988 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xtss" podStartSLOduration=2.6345857070000003 podStartE2EDuration="56.76796843s" podCreationTimestamp="2025-09-30 17:04:19 +0000 UTC" firstStartedPulling="2025-09-30 17:04:21.02941756 +0000 UTC m=+161.936430391" lastFinishedPulling="2025-09-30 17:05:15.162800263 +0000 UTC m=+216.069813114" observedRunningTime="2025-09-30 17:05:15.765717791 +0000 UTC m=+216.672730642" watchObservedRunningTime="2025-09-30 17:05:15.76796843 +0000 UTC m=+216.674981261" Sep 30 17:05:17 crc kubenswrapper[4772]: I0930 17:05:17.343982 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:05:17 crc kubenswrapper[4772]: I0930 17:05:17.344524 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sln52" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="registry-server" containerID="cri-o://b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137" gracePeriod=2 Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.558438 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.623491 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfmng\" (UniqueName: \"kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng\") pod \"38b62f9b-60d9-4421-860e-72eb21c7aab4\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.623578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content\") pod \"38b62f9b-60d9-4421-860e-72eb21c7aab4\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.623644 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities\") pod \"38b62f9b-60d9-4421-860e-72eb21c7aab4\" (UID: \"38b62f9b-60d9-4421-860e-72eb21c7aab4\") " Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.624611 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities" (OuterVolumeSpecName: "utilities") pod "38b62f9b-60d9-4421-860e-72eb21c7aab4" (UID: "38b62f9b-60d9-4421-860e-72eb21c7aab4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.629714 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng" (OuterVolumeSpecName: "kube-api-access-zfmng") pod "38b62f9b-60d9-4421-860e-72eb21c7aab4" (UID: "38b62f9b-60d9-4421-860e-72eb21c7aab4"). InnerVolumeSpecName "kube-api-access-zfmng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.722863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38b62f9b-60d9-4421-860e-72eb21c7aab4" (UID: "38b62f9b-60d9-4421-860e-72eb21c7aab4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.725527 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfmng\" (UniqueName: \"kubernetes.io/projected/38b62f9b-60d9-4421-860e-72eb21c7aab4-kube-api-access-zfmng\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.725568 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.725582 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b62f9b-60d9-4421-860e-72eb21c7aab4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.771091 4772 generic.go:334] "Generic (PLEG): container finished" podID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerID="b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137" exitCode=0 Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.771144 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerDied","Data":"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137"} Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.771178 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sln52" event={"ID":"38b62f9b-60d9-4421-860e-72eb21c7aab4","Type":"ContainerDied","Data":"b268ce6886581b53badd56cb9ef956f6c800411a6059c79a0c1904801e735204"} Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.771200 4772 scope.go:117] "RemoveContainer" containerID="b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.771631 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sln52" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.789332 4772 scope.go:117] "RemoveContainer" containerID="dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.806980 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.809837 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sln52"] Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.826219 4772 scope.go:117] "RemoveContainer" containerID="34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.839553 4772 scope.go:117] "RemoveContainer" containerID="b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137" Sep 30 17:05:18 crc kubenswrapper[4772]: E0930 17:05:18.840093 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137\": container with ID starting with b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137 not found: ID does not exist" containerID="b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.840143 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137"} err="failed to get container status \"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137\": rpc error: code = NotFound desc = could not find container \"b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137\": container with ID starting with b76e4d9dd557d38c6068ac0d1acc7f43fa9d6ab0a8c6b8cf6c63b3960bbc2137 not found: ID does not exist" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.840174 4772 scope.go:117] "RemoveContainer" containerID="dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee" Sep 30 17:05:18 crc kubenswrapper[4772]: E0930 17:05:18.840518 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee\": container with ID starting with dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee not found: ID does not exist" containerID="dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.840555 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee"} err="failed to get container status \"dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee\": rpc error: code = NotFound desc = could not find container \"dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee\": container with ID starting with dc0ba9c56159302f47e0283adebf8862ee08e6f85f34776915c6b6c4a76221ee not found: ID does not exist" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.840584 4772 scope.go:117] "RemoveContainer" containerID="34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92" Sep 30 17:05:18 crc kubenswrapper[4772]: E0930 17:05:18.841961 4772 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92\": container with ID starting with 34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92 not found: ID does not exist" containerID="34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92" Sep 30 17:05:18 crc kubenswrapper[4772]: I0930 17:05:18.842000 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92"} err="failed to get container status \"34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92\": rpc error: code = NotFound desc = could not find container \"34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92\": container with ID starting with 34ce2e3a924bfa5011a45dfa4d49d023e34ec533dd978649c011ea35fc088a92 not found: ID does not exist" Sep 30 17:05:19 crc kubenswrapper[4772]: I0930 17:05:19.618920 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:05:19 crc kubenswrapper[4772]: I0930 17:05:19.618987 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:05:19 crc kubenswrapper[4772]: I0930 17:05:19.669070 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:05:19 crc kubenswrapper[4772]: I0930 17:05:19.831702 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:05:19 crc kubenswrapper[4772]: I0930 17:05:19.905238 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" path="/var/lib/kubelet/pods/38b62f9b-60d9-4421-860e-72eb21c7aab4/volumes" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.119218 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.119253 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.174476 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.267871 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.267918 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.309621 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.831590 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:20 crc kubenswrapper[4772]: I0930 17:05:20.840187 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:21 crc kubenswrapper[4772]: I0930 17:05:21.826279 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:05:21 crc kubenswrapper[4772]: I0930 17:05:21.826323 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:05:21 crc kubenswrapper[4772]: I0930 17:05:21.870199 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:05:22 crc kubenswrapper[4772]: I0930 17:05:22.255126 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:22 crc kubenswrapper[4772]: I0930 17:05:22.255569 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:22 crc kubenswrapper[4772]: I0930 17:05:22.317241 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:22 crc kubenswrapper[4772]: I0930 17:05:22.849020 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:22 crc kubenswrapper[4772]: I0930 17:05:22.855380 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.144762 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.145079 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gn4wh" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="registry-server" containerID="cri-o://50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365" gracePeriod=2 Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.493221 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.594200 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content\") pod \"a1577bbb-ede6-49be-b05e-09e35194cde6\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.594330 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smml7\" (UniqueName: \"kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7\") pod \"a1577bbb-ede6-49be-b05e-09e35194cde6\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.594359 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities\") pod \"a1577bbb-ede6-49be-b05e-09e35194cde6\" (UID: \"a1577bbb-ede6-49be-b05e-09e35194cde6\") " Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.595627 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities" (OuterVolumeSpecName: "utilities") pod "a1577bbb-ede6-49be-b05e-09e35194cde6" (UID: "a1577bbb-ede6-49be-b05e-09e35194cde6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.601504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7" (OuterVolumeSpecName: "kube-api-access-smml7") pod "a1577bbb-ede6-49be-b05e-09e35194cde6" (UID: "a1577bbb-ede6-49be-b05e-09e35194cde6"). InnerVolumeSpecName "kube-api-access-smml7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.696989 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smml7\" (UniqueName: \"kubernetes.io/projected/a1577bbb-ede6-49be-b05e-09e35194cde6-kube-api-access-smml7\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.697047 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.741724 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.742333 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xtss" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="registry-server" containerID="cri-o://b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c" gracePeriod=2 Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.812260 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerID="50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365" exitCode=0 Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.812363 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn4wh" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.812386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerDied","Data":"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365"} Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.812460 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn4wh" event={"ID":"a1577bbb-ede6-49be-b05e-09e35194cde6","Type":"ContainerDied","Data":"e1fe596b9108c853f455bc42351d7ff0403aeb050b1b5b603e200154658797a9"} Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.812481 4772 scope.go:117] "RemoveContainer" containerID="50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.832932 4772 scope.go:117] "RemoveContainer" containerID="a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.848691 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1577bbb-ede6-49be-b05e-09e35194cde6" (UID: "a1577bbb-ede6-49be-b05e-09e35194cde6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.849447 4772 scope.go:117] "RemoveContainer" containerID="bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.884858 4772 scope.go:117] "RemoveContainer" containerID="50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365" Sep 30 17:05:23 crc kubenswrapper[4772]: E0930 17:05:23.885361 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365\": container with ID starting with 50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365 not found: ID does not exist" containerID="50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.885405 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365"} err="failed to get container status \"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365\": rpc error: code = NotFound desc = could not find container \"50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365\": container with ID starting with 50868d6061c4f1758df8c9b0ad2e07cdb9225dde3427d608d7e5144ef3ae1365 not found: ID does not exist" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.885429 4772 scope.go:117] "RemoveContainer" containerID="a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240" Sep 30 17:05:23 crc kubenswrapper[4772]: E0930 17:05:23.885730 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240\": container with ID starting with a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240 not found: ID does not exist" 
containerID="a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.885763 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240"} err="failed to get container status \"a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240\": rpc error: code = NotFound desc = could not find container \"a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240\": container with ID starting with a674f31771e8ae043712d7961f79a79407466afe9858544312310e4e1bc0b240 not found: ID does not exist" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.885779 4772 scope.go:117] "RemoveContainer" containerID="bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545" Sep 30 17:05:23 crc kubenswrapper[4772]: E0930 17:05:23.886091 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545\": container with ID starting with bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545 not found: ID does not exist" containerID="bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.886114 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545"} err="failed to get container status \"bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545\": rpc error: code = NotFound desc = could not find container \"bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545\": container with ID starting with bb2eb7b44ce09d8a442e80587c45859c2b4b94839cc0ad8b35e418067b0ca545 not found: ID does not exist" Sep 30 17:05:23 crc kubenswrapper[4772]: I0930 17:05:23.900127 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1577bbb-ede6-49be-b05e-09e35194cde6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.132837 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.139231 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gn4wh"] Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.598111 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.719867 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content\") pod \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.719942 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9tw\" (UniqueName: \"kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw\") pod \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.720035 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities\") pod \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\" (UID: \"34d82af0-e8df-49bb-bc8b-372ed51d7d53\") " Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.721172 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities" (OuterVolumeSpecName: "utilities") pod "34d82af0-e8df-49bb-bc8b-372ed51d7d53" (UID: "34d82af0-e8df-49bb-bc8b-372ed51d7d53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.725612 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw" (OuterVolumeSpecName: "kube-api-access-hz9tw") pod "34d82af0-e8df-49bb-bc8b-372ed51d7d53" (UID: "34d82af0-e8df-49bb-bc8b-372ed51d7d53"). InnerVolumeSpecName "kube-api-access-hz9tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.822285 4772 generic.go:334] "Generic (PLEG): container finished" podID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerID="b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c" exitCode=0 Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.822343 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerDied","Data":"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c"} Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.822379 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xtss" event={"ID":"34d82af0-e8df-49bb-bc8b-372ed51d7d53","Type":"ContainerDied","Data":"5d78b606977d1fef452b662772eed626924bea7ae3f9431be5e18b3180c8e77e"} Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.822401 4772 scope.go:117] "RemoveContainer" containerID="b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.822589 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xtss" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.823029 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.823087 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz9tw\" (UniqueName: \"kubernetes.io/projected/34d82af0-e8df-49bb-bc8b-372ed51d7d53-kube-api-access-hz9tw\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.842626 4772 scope.go:117] "RemoveContainer" containerID="d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.862368 4772 scope.go:117] "RemoveContainer" containerID="0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.882976 4772 scope.go:117] "RemoveContainer" containerID="b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c" Sep 30 17:05:24 crc kubenswrapper[4772]: E0930 17:05:24.884617 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c\": container with ID starting with b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c not found: ID does not exist" containerID="b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.884663 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c"} err="failed to get container status \"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c\": rpc error: code = NotFound desc = could not find container \"b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c\": container with ID starting with b97ec20f110fd7edb7846eb34cd88ebd9585e4976158cd3716f9f177dfb19f1c not found: ID does not exist" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.884710 4772 scope.go:117] "RemoveContainer" containerID="d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606" Sep 30 17:05:24 crc kubenswrapper[4772]: E0930 17:05:24.885645 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606\": container with ID starting with d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606 not found: ID does not exist" containerID="d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.885713 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606"} err="failed to get container status \"d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606\": rpc error: code = NotFound desc = could not find container \"d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606\": container with ID starting with d0ddde0d8824b9b0a3c739fc7a9797f4de7a69230bebe03277921c7a63ef3606 not found: ID does not exist" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.885759 4772 scope.go:117] "RemoveContainer" 
containerID="0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d" Sep 30 17:05:24 crc kubenswrapper[4772]: E0930 17:05:24.886326 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d\": container with ID starting with 0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d not found: ID does not exist" containerID="0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d" Sep 30 17:05:24 crc kubenswrapper[4772]: I0930 17:05:24.886359 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d"} err="failed to get container status \"0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d\": rpc error: code = NotFound desc = could not find container \"0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d\": container with ID starting with 0d6e1cdfc1ab3e8115abc7259cdeeaee611529aa408be296975831c266563f3d not found: ID does not exist" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.074738 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34d82af0-e8df-49bb-bc8b-372ed51d7d53" (UID: "34d82af0-e8df-49bb-bc8b-372ed51d7d53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.127182 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d82af0-e8df-49bb-bc8b-372ed51d7d53-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.156239 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.161002 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xtss"] Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.543463 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.543681 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pm9fz" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="registry-server" containerID="cri-o://21b51b2bcd0279c98a79bc482793b829c78919539c4dfdd07df8ff344a13f444" gracePeriod=2 Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.830645 4772 generic.go:334] "Generic (PLEG): container finished" podID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerID="21b51b2bcd0279c98a79bc482793b829c78919539c4dfdd07df8ff344a13f444" exitCode=0 Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.830715 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerDied","Data":"21b51b2bcd0279c98a79bc482793b829c78919539c4dfdd07df8ff344a13f444"} Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.873984 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.908527 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" path="/var/lib/kubelet/pods/34d82af0-e8df-49bb-bc8b-372ed51d7d53/volumes" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.909291 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" path="/var/lib/kubelet/pods/a1577bbb-ede6-49be-b05e-09e35194cde6/volumes" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.938913 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnq9p\" (UniqueName: \"kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p\") pod \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.939009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content\") pod \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.939042 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities\") pod \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\" (UID: \"ca089976-6fee-4d9a-9e3e-48a61b6de5f1\") " Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.940541 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities" (OuterVolumeSpecName: "utilities") pod "ca089976-6fee-4d9a-9e3e-48a61b6de5f1" (UID: "ca089976-6fee-4d9a-9e3e-48a61b6de5f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.946269 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p" (OuterVolumeSpecName: "kube-api-access-tnq9p") pod "ca089976-6fee-4d9a-9e3e-48a61b6de5f1" (UID: "ca089976-6fee-4d9a-9e3e-48a61b6de5f1"). InnerVolumeSpecName "kube-api-access-tnq9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:05:25 crc kubenswrapper[4772]: I0930 17:05:25.952775 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca089976-6fee-4d9a-9e3e-48a61b6de5f1" (UID: "ca089976-6fee-4d9a-9e3e-48a61b6de5f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.040777 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.040833 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.040846 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnq9p\" (UniqueName: \"kubernetes.io/projected/ca089976-6fee-4d9a-9e3e-48a61b6de5f1-kube-api-access-tnq9p\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.838578 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm9fz" event={"ID":"ca089976-6fee-4d9a-9e3e-48a61b6de5f1","Type":"ContainerDied","Data":"19f860a20674e3c70af12ab3d84768cb0cb47a5d773c5afe05d2c84f3a3d2615"} Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.838650 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm9fz" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.838913 4772 scope.go:117] "RemoveContainer" containerID="21b51b2bcd0279c98a79bc482793b829c78919539c4dfdd07df8ff344a13f444" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.862834 4772 scope.go:117] "RemoveContainer" containerID="dcb43f4b2c0fa1bd7d6ca67ab44973202cd880bb26b302ce399aa4b38139d7e5" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.885579 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.886048 4772 scope.go:117] "RemoveContainer" containerID="8b51889a237490eaa75e7bd403ede266dfb23f8c3483e3555d9cba215c78df32" Sep 30 17:05:26 crc kubenswrapper[4772]: I0930 17:05:26.887984 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm9fz"] Sep 30 17:05:27 crc kubenswrapper[4772]: I0930 17:05:27.904601 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" path="/var/lib/kubelet/pods/ca089976-6fee-4d9a-9e3e-48a61b6de5f1/volumes" Sep 30 17:05:30 crc kubenswrapper[4772]: I0930 17:05:30.897942 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:05:56 crc kubenswrapper[4772]: I0930 17:05:56.890949 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" containerName="oauth-openshift" containerID="cri-o://a4448a5e0816ebb3aaea0e8bf6a247326e27cba12796f5884a432e9e8c636204" gracePeriod=15 Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.018072 4772 generic.go:334] "Generic (PLEG): container finished" podID="63eeced4-9c90-46e5-9234-938f88df7c49" containerID="a4448a5e0816ebb3aaea0e8bf6a247326e27cba12796f5884a432e9e8c636204" exitCode=0 Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.018115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" 
event={"ID":"63eeced4-9c90-46e5-9234-938f88df7c49","Type":"ContainerDied","Data":"a4448a5e0816ebb3aaea0e8bf6a247326e27cba12796f5884a432e9e8c636204"} Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.244503 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.280388 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bd7987fd5-xlwvx"] Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281309 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281333 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281356 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281366 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281380 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281387 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281400 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281407 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281418 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" containerName="oauth-openshift" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281426 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" containerName="oauth-openshift" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281434 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281442 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281454 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281461 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281473 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="extract-utilities" Sep 30 17:05:57 crc 
kubenswrapper[4772]: I0930 17:05:57.281480 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281493 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281501 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281509 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a68e96d-d547-4060-8ab8-c693324a4423" containerName="collect-profiles" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281517 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a68e96d-d547-4060-8ab8-c693324a4423" containerName="collect-profiles" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281528 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281537 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281548 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281555 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281567 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" containerName="pruner" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281574 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" containerName="pruner" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281586 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281593 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="extract-utilities" Sep 30 17:05:57 crc kubenswrapper[4772]: E0930 17:05:57.281603 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281610 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="extract-content" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281736 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" containerName="oauth-openshift" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281750 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca089976-6fee-4d9a-9e3e-48a61b6de5f1" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281760 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a68e96d-d547-4060-8ab8-c693324a4423" containerName="collect-profiles" 
Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281774 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaebe8c-bc5b-49e6-8505-1cc00258b9c0" containerName="pruner" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281782 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1577bbb-ede6-49be-b05e-09e35194cde6" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281789 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d82af0-e8df-49bb-bc8b-372ed51d7d53" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.281799 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b62f9b-60d9-4421-860e-72eb21c7aab4" containerName="registry-server" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.282263 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.307500 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bd7987fd5-xlwvx"] Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381038 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381199 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381258 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381512 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.381575 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r8df\" (UniqueName: \"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383183 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.382731 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.382910 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383228 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383424 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383481 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383521 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383550 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383615 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig\") pod \"63eeced4-9c90-46e5-9234-938f88df7c49\" (UID: \"63eeced4-9c90-46e5-9234-938f88df7c49\") " Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.383917 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384032 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-policies\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384186 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384210 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-dir\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384295 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2w47\" (UniqueName: \"kubernetes.io/projected/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-kube-api-access-r2w47\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384313 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384584 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-error\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-login\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384615 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384734 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384774 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-session\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384808 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384841 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384920 4772 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/63eeced4-9c90-46e5-9234-938f88df7c49-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384942 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384961 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384975 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.384990 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.390669 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df" (OuterVolumeSpecName: "kube-api-access-2r8df") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "kube-api-access-2r8df". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.392343 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.393039 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.393279 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.394042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.394312 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.394660 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.394828 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.395278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "63eeced4-9c90-46e5-9234-938f88df7c49" (UID: "63eeced4-9c90-46e5-9234-938f88df7c49"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486592 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-error\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486700 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-login\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486762 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-session\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486824 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486887 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486928 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-policies\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486966 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.486994 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487023 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487050 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-dir\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487102 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2w47\" (UniqueName: 
\"kubernetes.io/projected/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-kube-api-access-r2w47\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487163 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r8df\" (UniqueName: \"kubernetes.io/projected/63eeced4-9c90-46e5-9234-938f88df7c49-kube-api-access-2r8df\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487178 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487196 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487214 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487228 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487242 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487256 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487273 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487287 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/63eeced4-9c90-46e5-9234-938f88df7c49-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.487812 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.488641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.489215 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-dir\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.489554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-audit-policies\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.489608 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.490679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-session\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.491451 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-error\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.491769 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-login\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.492671 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.492822 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.492997 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.493880 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.494516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.506807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2w47\" (UniqueName: \"kubernetes.io/projected/da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942-kube-api-access-r2w47\") pod \"oauth-openshift-bd7987fd5-xlwvx\" (UID: \"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942\") " pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.609382 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:57 crc kubenswrapper[4772]: I0930 17:05:57.827964 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bd7987fd5-xlwvx"] Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.024155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" event={"ID":"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942","Type":"ContainerStarted","Data":"51f9136c24a12d7eadeab0f64cd872e2eef3d4bcac3fb6594109a7ee9f932fac"} Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.026023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" event={"ID":"63eeced4-9c90-46e5-9234-938f88df7c49","Type":"ContainerDied","Data":"87adc10635ca6554c206b32f3f275ecd982fd5fcd3476fc9a227f85df971254e"} Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.026102 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r4gqw" Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.026106 4772 scope.go:117] "RemoveContainer" containerID="a4448a5e0816ebb3aaea0e8bf6a247326e27cba12796f5884a432e9e8c636204" Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.084195 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:05:58 crc kubenswrapper[4772]: I0930 17:05:58.086400 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r4gqw"] Sep 30 17:05:59 crc kubenswrapper[4772]: I0930 17:05:59.038597 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" event={"ID":"da6aa7f7-20a1-4aa8-9ca0-961c7e0a7942","Type":"ContainerStarted","Data":"5962de95e84622b72f12fb54299e00f61633e46462914d6b4e42bd4f67d556e6"} Sep 30 17:05:59 crc kubenswrapper[4772]: I0930 17:05:59.038809 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:59 crc kubenswrapper[4772]: I0930 17:05:59.043930 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" Sep 30 17:05:59 crc kubenswrapper[4772]: I0930 17:05:59.065037 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bd7987fd5-xlwvx" podStartSLOduration=29.064983992 podStartE2EDuration="29.064983992s" podCreationTimestamp="2025-09-30 17:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:05:59.062134198 +0000 UTC m=+259.969147049" watchObservedRunningTime="2025-09-30 17:05:59.064983992 +0000 UTC m=+259.971996843" Sep 30 17:05:59 crc kubenswrapper[4772]: I0930 17:05:59.904890 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63eeced4-9c90-46e5-9234-938f88df7c49" path="/var/lib/kubelet/pods/63eeced4-9c90-46e5-9234-938f88df7c49/volumes" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.479538 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.481212 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6j79n" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" containerID="cri-o://4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" gracePeriod=30 Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.482187 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.482337 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ddnkh" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="registry-server" containerID="cri-o://2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" gracePeriod=30 Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.503847 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.504088 4772 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" containerID="cri-o://2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276" gracePeriod=30 Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.519611 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.519862 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-znf69" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="registry-server" containerID="cri-o://eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c" gracePeriod=30 Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.524641 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8qmk7"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.525335 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.532387 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.532592 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ntkh6" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="registry-server" containerID="cri-o://31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710" gracePeriod=30 Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.562127 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8qmk7"] Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.568081 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.568234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbs7\" (UniqueName: \"kubernetes.io/projected/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-kube-api-access-hbbs7\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.568308 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.623154 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 is running failed: container process not found" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.623942 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 is running failed: container process not found" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.627432 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 is running failed: container process not found" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.627524 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-ddnkh" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="registry-server" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.669745 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbs7\" (UniqueName: \"kubernetes.io/projected/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-kube-api-access-hbbs7\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.669788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.669900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.671038 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.676789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.689791 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbs7\" (UniqueName: \"kubernetes.io/projected/aae4ed0a-da1e-4581-913e-1c3c8c1554cc-kube-api-access-hbbs7\") pod \"marketplace-operator-79b997595-8qmk7\" (UID: \"aae4ed0a-da1e-4581-913e-1c3c8c1554cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.827369 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 is running failed: container process not found" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.828433 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 is running failed: container process not found" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.829284 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 is running failed: container process not found" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:06:09 crc kubenswrapper[4772]: E0930 17:06:09.829372 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-6j79n" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.937626 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.948756 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.994788 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities\") pod \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.994892 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content\") pod \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.994978 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wng64\" (UniqueName: \"kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64\") pod \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\" (UID: \"4ee62041-3f62-4daa-b7cc-9b8ec568bc61\") " Sep 30 17:06:09 crc kubenswrapper[4772]: I0930 17:06:09.996085 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities" (OuterVolumeSpecName: "utilities") pod "4ee62041-3f62-4daa-b7cc-9b8ec568bc61" (UID: "4ee62041-3f62-4daa-b7cc-9b8ec568bc61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.000325 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64" (OuterVolumeSpecName: "kube-api-access-wng64") pod "4ee62041-3f62-4daa-b7cc-9b8ec568bc61" (UID: "4ee62041-3f62-4daa-b7cc-9b8ec568bc61"). InnerVolumeSpecName "kube-api-access-wng64". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.057646 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.060025 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.096794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content\") pod \"8f701b8e-c15e-48f0-a732-fba005c98ff7\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.096869 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jwrh\" (UniqueName: \"kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh\") pod \"8f701b8e-c15e-48f0-a732-fba005c98ff7\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.096947 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mfrk\" (UniqueName: \"kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk\") pod \"35d2ea14-6885-4373-b795-4e4714b4a2ff\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097006 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities\") pod \"35d2ea14-6885-4373-b795-4e4714b4a2ff\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097035 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities\") pod \"8f701b8e-c15e-48f0-a732-fba005c98ff7\" (UID: \"8f701b8e-c15e-48f0-a732-fba005c98ff7\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097117 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content\") pod \"35d2ea14-6885-4373-b795-4e4714b4a2ff\" (UID: \"35d2ea14-6885-4373-b795-4e4714b4a2ff\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097358 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wng64\" (UniqueName: \"kubernetes.io/projected/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-kube-api-access-wng64\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097376 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.097874 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities" (OuterVolumeSpecName: "utilities") pod "35d2ea14-6885-4373-b795-4e4714b4a2ff" (UID: "35d2ea14-6885-4373-b795-4e4714b4a2ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.098440 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities" (OuterVolumeSpecName: "utilities") pod "8f701b8e-c15e-48f0-a732-fba005c98ff7" (UID: "8f701b8e-c15e-48f0-a732-fba005c98ff7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.102164 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.109369 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk" (OuterVolumeSpecName: "kube-api-access-2mfrk") pod "35d2ea14-6885-4373-b795-4e4714b4a2ff" (UID: "35d2ea14-6885-4373-b795-4e4714b4a2ff"). InnerVolumeSpecName "kube-api-access-2mfrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.109443 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh" (OuterVolumeSpecName: "kube-api-access-2jwrh") pod "8f701b8e-c15e-48f0-a732-fba005c98ff7" (UID: "8f701b8e-c15e-48f0-a732-fba005c98ff7"). InnerVolumeSpecName "kube-api-access-2jwrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.111808 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.123593 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e905513-23f6-4e8f-95df-0668beaad53d" containerID="2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276" exitCode=0 Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.123763 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.123996 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" event={"ID":"6e905513-23f6-4e8f-95df-0668beaad53d","Type":"ContainerDied","Data":"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.124069 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tbscw" event={"ID":"6e905513-23f6-4e8f-95df-0668beaad53d","Type":"ContainerDied","Data":"d9872df9fc19ca594f389f675f377249f13fd71c9350456f503b1ca3555eb295"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.124103 4772 scope.go:117] "RemoveContainer" containerID="2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.130445 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ee62041-3f62-4daa-b7cc-9b8ec568bc61" (UID: "4ee62041-3f62-4daa-b7cc-9b8ec568bc61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.134440 4772 generic.go:334] "Generic (PLEG): container finished" podID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" exitCode=0 Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.134582 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerDied","Data":"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.134624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddnkh" event={"ID":"35d2ea14-6885-4373-b795-4e4714b4a2ff","Type":"ContainerDied","Data":"b30b3f83b12646edef5a8b945323789934101d771152dfceb68f3aa267b696ba"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.135837 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddnkh" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.160720 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerID="31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710" exitCode=0 Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.160814 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerDied","Data":"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.160867 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntkh6" event={"ID":"4ee62041-3f62-4daa-b7cc-9b8ec568bc61","Type":"ContainerDied","Data":"465d828d30ab266f47fe0a548402fbc4200fc0b6ad158c52001cc4fcc6217d51"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.160828 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntkh6" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.178925 4772 generic.go:334] "Generic (PLEG): container finished" podID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerID="eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c" exitCode=0 Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.179270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerDied","Data":"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.179383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-znf69" event={"ID":"cc49ca3c-00ac-47d2-abd4-74436beb8c45","Type":"ContainerDied","Data":"c844f25dd285073b3d583e9c4b1e2a75f812581033b8fe64f5171c6bd4ed8c5f"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.179517 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-znf69" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.194857 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.195393 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ntkh6"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.200313 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content\") pod \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.200765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics\") pod \"6e905513-23f6-4e8f-95df-0668beaad53d\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.200865 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8zwb\" (UniqueName: \"kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb\") pod \"6e905513-23f6-4e8f-95df-0668beaad53d\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.200934 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities\") pod \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201145 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx42k\" (UniqueName: \"kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k\") pod \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\" (UID: \"cc49ca3c-00ac-47d2-abd4-74436beb8c45\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201261 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca\") pod \"6e905513-23f6-4e8f-95df-0668beaad53d\" (UID: \"6e905513-23f6-4e8f-95df-0668beaad53d\") " Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201359 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35d2ea14-6885-4373-b795-4e4714b4a2ff" (UID: "35d2ea14-6885-4373-b795-4e4714b4a2ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201641 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" exitCode=0 Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201659 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jwrh\" (UniqueName: \"kubernetes.io/projected/8f701b8e-c15e-48f0-a732-fba005c98ff7-kube-api-access-2jwrh\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201685 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerDied","Data":"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201715 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j79n" event={"ID":"8f701b8e-c15e-48f0-a732-fba005c98ff7","Type":"ContainerDied","Data":"4dcaf2cad859efc7378f7394a4212a3d7eda25792556ab835d44e57cba4b7a1f"} Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201693 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mfrk\" (UniqueName: \"kubernetes.io/projected/35d2ea14-6885-4373-b795-4e4714b4a2ff-kube-api-access-2mfrk\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201756 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee62041-3f62-4daa-b7cc-9b8ec568bc61-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201769 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201783 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.201794 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d2ea14-6885-4373-b795-4e4714b4a2ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.202172 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j79n" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.202417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities" (OuterVolumeSpecName: "utilities") pod "cc49ca3c-00ac-47d2-abd4-74436beb8c45" (UID: "cc49ca3c-00ac-47d2-abd4-74436beb8c45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.203911 4772 scope.go:117] "RemoveContainer" containerID="2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.205336 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6e905513-23f6-4e8f-95df-0668beaad53d" (UID: "6e905513-23f6-4e8f-95df-0668beaad53d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.207561 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276\": container with ID starting with 2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276 not found: ID does not exist" containerID="2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.207626 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276"} err="failed to get container status \"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276\": rpc error: code = NotFound desc = could not find container \"2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276\": container with ID starting with 2042351b9b45f136e11ce51f4271df917d34c678162f3fdd6920b914405c4276 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.207697 4772 scope.go:117] "RemoveContainer" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.208432 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k" (OuterVolumeSpecName: "kube-api-access-sx42k") pod "cc49ca3c-00ac-47d2-abd4-74436beb8c45" (UID: "cc49ca3c-00ac-47d2-abd4-74436beb8c45"). InnerVolumeSpecName "kube-api-access-sx42k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.208729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6e905513-23f6-4e8f-95df-0668beaad53d" (UID: "6e905513-23f6-4e8f-95df-0668beaad53d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.208934 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb" (OuterVolumeSpecName: "kube-api-access-d8zwb") pod "6e905513-23f6-4e8f-95df-0668beaad53d" (UID: "6e905513-23f6-4e8f-95df-0668beaad53d"). InnerVolumeSpecName "kube-api-access-d8zwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.224660 4772 scope.go:117] "RemoveContainer" containerID="1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.232140 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc49ca3c-00ac-47d2-abd4-74436beb8c45" (UID: "cc49ca3c-00ac-47d2-abd4-74436beb8c45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.241472 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f701b8e-c15e-48f0-a732-fba005c98ff7" (UID: "8f701b8e-c15e-48f0-a732-fba005c98ff7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.243015 4772 scope.go:117] "RemoveContainer" containerID="3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.255041 4772 scope.go:117] "RemoveContainer" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.255534 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506\": container with ID starting with 2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 not found: ID does not exist" containerID="2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.255592 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506"} err="failed to get container status \"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506\": rpc error: code = NotFound desc = could not find container \"2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506\": container with ID starting with 2a3bf66a2d68ba66cd17fe2ad324c0f3f9b097f77d5cc10f54e28d82aa378506 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.255633 4772 scope.go:117] "RemoveContainer" containerID="1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.255887 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40\": container with ID starting with 1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40 not found: ID does not exist" containerID="1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.255920 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40"} err="failed to get container status \"1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40\": rpc error: code = NotFound desc 
= could not find container \"1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40\": container with ID starting with 1bff51d24279527986d6e85d24433baf79912ad8e7e8dfe613b84345a88c0d40 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.255943 4772 scope.go:117] "RemoveContainer" containerID="3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.256240 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607\": container with ID starting with 3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607 not found: ID does not exist" containerID="3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.256269 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607"} err="failed to get container status \"3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607\": rpc error: code = NotFound desc = could not find container \"3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607\": container with ID starting with 3eb1db783d86bfa9211a1741ca7371b7a9f61c5dbaa33f38631422b11bfa1607 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.256286 4772 scope.go:117] "RemoveContainer" containerID="31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.272014 4772 scope.go:117] "RemoveContainer" containerID="107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.289373 4772 scope.go:117] "RemoveContainer" containerID="8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303044 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx42k\" (UniqueName: \"kubernetes.io/projected/cc49ca3c-00ac-47d2-abd4-74436beb8c45-kube-api-access-sx42k\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303104 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303118 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303130 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f701b8e-c15e-48f0-a732-fba005c98ff7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303145 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e905513-23f6-4e8f-95df-0668beaad53d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303157 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8zwb\" 
(UniqueName: \"kubernetes.io/projected/6e905513-23f6-4e8f-95df-0668beaad53d-kube-api-access-d8zwb\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.303169 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc49ca3c-00ac-47d2-abd4-74436beb8c45-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.306637 4772 scope.go:117] "RemoveContainer" containerID="31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.307242 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710\": container with ID starting with 31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710 not found: ID does not exist" containerID="31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.307299 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710"} err="failed to get container status \"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710\": rpc error: code = NotFound desc = could not find container \"31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710\": container with ID starting with 31f5cb2683a308f73b0504bf84c3ee981cd929e446a192e4a9ad42b156c50710 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.307342 4772 scope.go:117] "RemoveContainer" containerID="107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.307870 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f\": container with ID starting with 107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f not found: ID does not exist" containerID="107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.307912 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f"} err="failed to get container status \"107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f\": rpc error: code = NotFound desc = could not find container \"107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f\": container with ID starting with 107cd3843131c426697f1c56523e6a36edca7cb40386c4c570dec73a543db49f not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.307944 4772 scope.go:117] "RemoveContainer" containerID="8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.308261 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0\": container with ID starting with 8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0 not found: ID does not exist" containerID="8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0" Sep 30 17:06:10 crc kubenswrapper[4772]: 
I0930 17:06:10.308298 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0"} err="failed to get container status \"8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0\": rpc error: code = NotFound desc = could not find container \"8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0\": container with ID starting with 8cf93edc0153dcb2c3bce61d9aba4fcb73128d6acb4946349a96162be01e94e0 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.308318 4772 scope.go:117] "RemoveContainer" containerID="eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.322147 4772 scope.go:117] "RemoveContainer" containerID="6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.333941 4772 scope.go:117] "RemoveContainer" containerID="89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.349844 4772 scope.go:117] "RemoveContainer" containerID="eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.350301 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c\": container with ID starting with eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c not found: ID does not exist" containerID="eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.350345 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c"} err="failed to get container status \"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c\": rpc error: code = NotFound desc = could not find container \"eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c\": container with ID starting with eadd02a10414e992d1f8a1a42573ffb4b49729ca607b96164a722a815ea4b73c not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.350380 4772 scope.go:117] "RemoveContainer" containerID="6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.350801 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1\": container with ID starting with 6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1 not found: ID does not exist" containerID="6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.350845 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1"} err="failed to get container status \"6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1\": rpc error: code = NotFound desc = could not find container \"6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1\": container with ID starting with 6f62e97d810f1d9fc7533a0ff658fd171d736381979cb58e43f5835650098db1 not found: ID does 
not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.350875 4772 scope.go:117] "RemoveContainer" containerID="89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.351429 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15\": container with ID starting with 89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15 not found: ID does not exist" containerID="89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.351474 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15"} err="failed to get container status \"89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15\": rpc error: code = NotFound desc = could not find container \"89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15\": container with ID starting with 89a3a69605c528afbe408f80a5ec3c9c11f6899c992f37b058d97bd1e0facc15 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.351507 4772 scope.go:117] "RemoveContainer" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.371045 4772 scope.go:117] "RemoveContainer" containerID="3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.384950 4772 scope.go:117] "RemoveContainer" containerID="12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.396703 4772 scope.go:117] "RemoveContainer" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.397232 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8\": container with ID starting with 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 not found: ID does not exist" containerID="4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.397267 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8"} err="failed to get container status \"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8\": rpc error: code = NotFound desc = could not find container \"4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8\": container with ID starting with 4ccff47a85e76f5d3ff2cab893451174687c010cf51f458341d39724fa2b9fc8 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.397292 4772 scope.go:117] "RemoveContainer" containerID="3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.397624 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435\": container with ID starting with 
3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435 not found: ID does not exist" containerID="3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.397647 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435"} err="failed to get container status \"3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435\": rpc error: code = NotFound desc = could not find container \"3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435\": container with ID starting with 3737ce4bf5864d187096794aa216df88c65452678728e41a5f66335c2740b435 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.397662 4772 scope.go:117] "RemoveContainer" containerID="12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5" Sep 30 17:06:10 crc kubenswrapper[4772]: E0930 17:06:10.397935 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5\": container with ID starting with 12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5 not found: ID does not exist" containerID="12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.397976 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5"} err="failed to get container status \"12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5\": rpc error: code = NotFound desc = could not find container \"12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5\": container with ID starting with 12038b971edeb750a1e1c9bce1126a87175d289f2dd76da84be1b4e0205140d5 not found: ID does not exist" Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.473046 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.478433 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tbscw"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.489009 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.491353 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ddnkh"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.500272 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8qmk7"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.512643 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.515536 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-znf69"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.536540 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:06:10 crc kubenswrapper[4772]: I0930 17:06:10.541063 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-6j79n"] Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.211811 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" event={"ID":"aae4ed0a-da1e-4581-913e-1c3c8c1554cc","Type":"ContainerStarted","Data":"968884a561b8d0f9844194467c259afb45a54072495bd3c20375be08de587b24"} Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.211858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" event={"ID":"aae4ed0a-da1e-4581-913e-1c3c8c1554cc","Type":"ContainerStarted","Data":"4d322754ca8df9f2ea0c8ccdba69178691a6b73a9c73bde5a28e527073fa47a3"} Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.250177 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" podStartSLOduration=2.250147924 podStartE2EDuration="2.250147924s" podCreationTimestamp="2025-09-30 17:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:06:11.229187516 +0000 UTC m=+272.136200357" watchObservedRunningTime="2025-09-30 17:06:11.250147924 +0000 UTC m=+272.157160755" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.700505 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-22cvm"] Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701027 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701039 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701048 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701072 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701083 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701089 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701100 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701105 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701116 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701122 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701130 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701136 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701199 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701206 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701213 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701219 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701231 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701238 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701247 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701252 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701261 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701267 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701273 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701278 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="extract-content" Sep 30 17:06:11 crc kubenswrapper[4772]: E0930 17:06:11.701288 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701293 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="extract-utilities" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701379 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" containerName="marketplace-operator" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701388 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" containerName="registry-server" Sep 30 
17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701396 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701405 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.701416 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" containerName="registry-server" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.702192 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.704541 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.711913 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22cvm"] Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.823979 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xg5\" (UniqueName: \"kubernetes.io/projected/c7d4b164-e082-47f7-ab01-643d7bb3788b-kube-api-access-v5xg5\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.824051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-catalog-content\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.824081 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-utilities\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.908798 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d2ea14-6885-4373-b795-4e4714b4a2ff" path="/var/lib/kubelet/pods/35d2ea14-6885-4373-b795-4e4714b4a2ff/volumes" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.909435 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee62041-3f62-4daa-b7cc-9b8ec568bc61" path="/var/lib/kubelet/pods/4ee62041-3f62-4daa-b7cc-9b8ec568bc61/volumes" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.910033 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e905513-23f6-4e8f-95df-0668beaad53d" path="/var/lib/kubelet/pods/6e905513-23f6-4e8f-95df-0668beaad53d/volumes" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.912233 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f701b8e-c15e-48f0-a732-fba005c98ff7" path="/var/lib/kubelet/pods/8f701b8e-c15e-48f0-a732-fba005c98ff7/volumes" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.913021 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cc49ca3c-00ac-47d2-abd4-74436beb8c45" path="/var/lib/kubelet/pods/cc49ca3c-00ac-47d2-abd4-74436beb8c45/volumes" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.913531 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4zvrj"] Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.915650 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zvrj"] Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.915740 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.918672 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.925914 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-catalog-content\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.925958 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-utilities\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.926028 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xg5\" (UniqueName: \"kubernetes.io/projected/c7d4b164-e082-47f7-ab01-643d7bb3788b-kube-api-access-v5xg5\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.926515 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-utilities\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.927200 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d4b164-e082-47f7-ab01-643d7bb3788b-catalog-content\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:11 crc kubenswrapper[4772]: I0930 17:06:11.949513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xg5\" (UniqueName: \"kubernetes.io/projected/c7d4b164-e082-47f7-ab01-643d7bb3788b-kube-api-access-v5xg5\") pod \"redhat-marketplace-22cvm\" (UID: \"c7d4b164-e082-47f7-ab01-643d7bb3788b\") " pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.027603 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qw9\" (UniqueName: \"kubernetes.io/projected/ee561213-a3c6-4429-9f8d-f670a07494c5-kube-api-access-q9qw9\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " 
pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.027651 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-catalog-content\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.027715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-utilities\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.034373 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.133845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qw9\" (UniqueName: \"kubernetes.io/projected/ee561213-a3c6-4429-9f8d-f670a07494c5-kube-api-access-q9qw9\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.134259 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-catalog-content\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.134314 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-utilities\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.134717 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-catalog-content\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.134762 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee561213-a3c6-4429-9f8d-f670a07494c5-utilities\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.152678 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qw9\" (UniqueName: \"kubernetes.io/projected/ee561213-a3c6-4429-9f8d-f670a07494c5-kube-api-access-q9qw9\") pod \"redhat-operators-4zvrj\" (UID: \"ee561213-a3c6-4429-9f8d-f670a07494c5\") " pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.228805 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.230637 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22cvm"] Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.230872 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.232346 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8qmk7" Sep 30 17:06:12 crc kubenswrapper[4772]: W0930 17:06:12.244298 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d4b164_e082_47f7_ab01_643d7bb3788b.slice/crio-723e69b34c27c69b63556061c596dbfb875af930632bf297e4e1ba1092866a14 WatchSource:0}: Error finding container 723e69b34c27c69b63556061c596dbfb875af930632bf297e4e1ba1092866a14: Status 404 returned error can't find the container with id 723e69b34c27c69b63556061c596dbfb875af930632bf297e4e1ba1092866a14 Sep 30 17:06:12 crc kubenswrapper[4772]: I0930 17:06:12.651542 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zvrj"] Sep 30 17:06:12 crc kubenswrapper[4772]: W0930 17:06:12.655406 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee561213_a3c6_4429_9f8d_f670a07494c5.slice/crio-5fda453bf9925597a595719f3cad7ee2fff89a652e32c248b723a7e062d1353c WatchSource:0}: Error finding container 5fda453bf9925597a595719f3cad7ee2fff89a652e32c248b723a7e062d1353c: Status 404 returned error can't find the container with id 5fda453bf9925597a595719f3cad7ee2fff89a652e32c248b723a7e062d1353c Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.235476 4772 generic.go:334] "Generic (PLEG): container finished" podID="ee561213-a3c6-4429-9f8d-f670a07494c5" containerID="fbc2577cadad1ea0ee112bf4e1fee6af1ec6f5d6e8f6732c537db863a3ef670b" exitCode=0 Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.235593 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zvrj" event={"ID":"ee561213-a3c6-4429-9f8d-f670a07494c5","Type":"ContainerDied","Data":"fbc2577cadad1ea0ee112bf4e1fee6af1ec6f5d6e8f6732c537db863a3ef670b"} Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.235990 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zvrj" event={"ID":"ee561213-a3c6-4429-9f8d-f670a07494c5","Type":"ContainerStarted","Data":"5fda453bf9925597a595719f3cad7ee2fff89a652e32c248b723a7e062d1353c"} Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.237419 4772 generic.go:334] "Generic (PLEG): container finished" podID="c7d4b164-e082-47f7-ab01-643d7bb3788b" containerID="82980615be00a1720f6dd5a05c73ddbfbf5346862a792a50a610f30bacdd185c" exitCode=0 Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.238076 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22cvm" event={"ID":"c7d4b164-e082-47f7-ab01-643d7bb3788b","Type":"ContainerDied","Data":"82980615be00a1720f6dd5a05c73ddbfbf5346862a792a50a610f30bacdd185c"} Sep 30 17:06:13 crc kubenswrapper[4772]: I0930 17:06:13.238104 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22cvm" 
event={"ID":"c7d4b164-e082-47f7-ab01-643d7bb3788b","Type":"ContainerStarted","Data":"723e69b34c27c69b63556061c596dbfb875af930632bf297e4e1ba1092866a14"} Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.100970 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cl5mx"] Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.103184 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.105668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.112286 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cl5mx"] Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.167850 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-utilities\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.167936 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrdn\" (UniqueName: \"kubernetes.io/projected/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-kube-api-access-4vrdn\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.168013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-catalog-content\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.246505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22cvm" event={"ID":"c7d4b164-e082-47f7-ab01-643d7bb3788b","Type":"ContainerStarted","Data":"a429ae24f0ba969901830dc6248eca1bb26a6f548d695144818923d1ddbee119"} Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.269637 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrdn\" (UniqueName: \"kubernetes.io/projected/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-kube-api-access-4vrdn\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.269732 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-catalog-content\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.269769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-utilities\") pod \"community-operators-cl5mx\" (UID: 
\"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.270261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-utilities\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.270664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-catalog-content\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.292554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrdn\" (UniqueName: \"kubernetes.io/projected/a71ddc7a-9a49-4cd9-842a-c6f24957a6e3-kube-api-access-4vrdn\") pod \"community-operators-cl5mx\" (UID: \"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3\") " pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.302317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7knsg"] Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.303716 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.311852 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7knsg"] Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.313095 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.371154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9djd\" (UniqueName: \"kubernetes.io/projected/1522b5bf-cf61-4f95-a15c-63245f3eab54-kube-api-access-p9djd\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.371295 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-utilities\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.371323 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-catalog-content\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.443120 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.472284 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-utilities\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.472357 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-catalog-content\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.472422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9djd\" (UniqueName: \"kubernetes.io/projected/1522b5bf-cf61-4f95-a15c-63245f3eab54-kube-api-access-p9djd\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.473761 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-catalog-content\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.473802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1522b5bf-cf61-4f95-a15c-63245f3eab54-utilities\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.492471 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9djd\" (UniqueName: \"kubernetes.io/projected/1522b5bf-cf61-4f95-a15c-63245f3eab54-kube-api-access-p9djd\") pod \"certified-operators-7knsg\" (UID: \"1522b5bf-cf61-4f95-a15c-63245f3eab54\") " pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.646711 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cl5mx"] Sep 30 17:06:14 crc kubenswrapper[4772]: I0930 17:06:14.658368 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.043887 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7knsg"] Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.253133 4772 generic.go:334] "Generic (PLEG): container finished" podID="ee561213-a3c6-4429-9f8d-f670a07494c5" containerID="2ed33c47f58af2f1f097f9950ba153effc10a72aa3bedf8b02028c0f4769dfa2" exitCode=0 Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.253196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zvrj" event={"ID":"ee561213-a3c6-4429-9f8d-f670a07494c5","Type":"ContainerDied","Data":"2ed33c47f58af2f1f097f9950ba153effc10a72aa3bedf8b02028c0f4769dfa2"} Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.260004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22cvm" event={"ID":"c7d4b164-e082-47f7-ab01-643d7bb3788b","Type":"ContainerDied","Data":"a429ae24f0ba969901830dc6248eca1bb26a6f548d695144818923d1ddbee119"} Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.260107 4772 generic.go:334] "Generic (PLEG): container finished" podID="c7d4b164-e082-47f7-ab01-643d7bb3788b" containerID="a429ae24f0ba969901830dc6248eca1bb26a6f548d695144818923d1ddbee119" exitCode=0 Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.262149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7knsg" event={"ID":"1522b5bf-cf61-4f95-a15c-63245f3eab54","Type":"ContainerStarted","Data":"8ca7b601af3c23ed2e92919704549ac1a7b912807ef598813a2604bad0cec357"} Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.264169 4772 generic.go:334] "Generic (PLEG): container finished" podID="a71ddc7a-9a49-4cd9-842a-c6f24957a6e3" containerID="85b55f1156365e0495dbbaba16d76b2a50dc6c9c6b3ddd3e982728b3f4b48c89" exitCode=0 Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.264211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl5mx" event={"ID":"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3","Type":"ContainerDied","Data":"85b55f1156365e0495dbbaba16d76b2a50dc6c9c6b3ddd3e982728b3f4b48c89"} Sep 30 17:06:15 crc kubenswrapper[4772]: I0930 17:06:15.264234 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl5mx" event={"ID":"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3","Type":"ContainerStarted","Data":"8f8250b66b5eac68225822414aebb5cb3d10fc72869c0d0bd071775e02a1a404"} Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.272626 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl5mx" event={"ID":"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3","Type":"ContainerStarted","Data":"86a4a6b4e55ca4c1d907ca0bc0cb414b4ad62dae0076f7901b49f35b224f104e"} Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.275271 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zvrj" event={"ID":"ee561213-a3c6-4429-9f8d-f670a07494c5","Type":"ContainerStarted","Data":"ae723b7b27a9b132b7a715f8d877ce9cb8f9633c9cf278812ad094b0e92f22f2"} Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.280298 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22cvm" 
event={"ID":"c7d4b164-e082-47f7-ab01-643d7bb3788b","Type":"ContainerStarted","Data":"9dee278fe3779d7998d67f3b6a68a069eee28038ba392a4b178132a5f2445462"} Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.282301 4772 generic.go:334] "Generic (PLEG): container finished" podID="1522b5bf-cf61-4f95-a15c-63245f3eab54" containerID="f6eb93d6ddd2626f3b5b56a261a754d80fbd8e791e5b2b9bb23bb0763d97f2e5" exitCode=0 Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.282360 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7knsg" event={"ID":"1522b5bf-cf61-4f95-a15c-63245f3eab54","Type":"ContainerDied","Data":"f6eb93d6ddd2626f3b5b56a261a754d80fbd8e791e5b2b9bb23bb0763d97f2e5"} Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.318335 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-22cvm" podStartSLOduration=2.8342152 podStartE2EDuration="5.318318456s" podCreationTimestamp="2025-09-30 17:06:11 +0000 UTC" firstStartedPulling="2025-09-30 17:06:13.238679995 +0000 UTC m=+274.145692826" lastFinishedPulling="2025-09-30 17:06:15.722783251 +0000 UTC m=+276.629796082" observedRunningTime="2025-09-30 17:06:16.315732468 +0000 UTC m=+277.222745309" watchObservedRunningTime="2025-09-30 17:06:16.318318456 +0000 UTC m=+277.225331287" Sep 30 17:06:16 crc kubenswrapper[4772]: I0930 17:06:16.367906 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4zvrj" podStartSLOduration=2.883282753 podStartE2EDuration="5.367886511s" podCreationTimestamp="2025-09-30 17:06:11 +0000 UTC" firstStartedPulling="2025-09-30 17:06:13.237137225 +0000 UTC m=+274.144150056" lastFinishedPulling="2025-09-30 17:06:15.721740983 +0000 UTC m=+276.628753814" observedRunningTime="2025-09-30 17:06:16.36478587 +0000 UTC m=+277.271798701" watchObservedRunningTime="2025-09-30 17:06:16.367886511 +0000 UTC m=+277.274899342" Sep 30 17:06:17 crc kubenswrapper[4772]: I0930 17:06:17.292358 4772 generic.go:334] "Generic (PLEG): container finished" podID="a71ddc7a-9a49-4cd9-842a-c6f24957a6e3" containerID="86a4a6b4e55ca4c1d907ca0bc0cb414b4ad62dae0076f7901b49f35b224f104e" exitCode=0 Sep 30 17:06:17 crc kubenswrapper[4772]: I0930 17:06:17.292466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl5mx" event={"ID":"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3","Type":"ContainerDied","Data":"86a4a6b4e55ca4c1d907ca0bc0cb414b4ad62dae0076f7901b49f35b224f104e"} Sep 30 17:06:18 crc kubenswrapper[4772]: I0930 17:06:18.300947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl5mx" event={"ID":"a71ddc7a-9a49-4cd9-842a-c6f24957a6e3","Type":"ContainerStarted","Data":"587e20c0a6d008bb6542b5340fb4f517e3dce8153e45629bbb1a9929baad4575"} Sep 30 17:06:18 crc kubenswrapper[4772]: I0930 17:06:18.304773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7knsg" event={"ID":"1522b5bf-cf61-4f95-a15c-63245f3eab54","Type":"ContainerStarted","Data":"243b81e35c04613761052aae543c348063812550364b84cbc57e11f9644d9b72"} Sep 30 17:06:18 crc kubenswrapper[4772]: I0930 17:06:18.328234 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cl5mx" podStartSLOduration=1.60567927 podStartE2EDuration="4.328214366s" podCreationTimestamp="2025-09-30 17:06:14 +0000 UTC" firstStartedPulling="2025-09-30 
17:06:15.265476208 +0000 UTC m=+276.172489039" lastFinishedPulling="2025-09-30 17:06:17.988011304 +0000 UTC m=+278.895024135" observedRunningTime="2025-09-30 17:06:18.326770268 +0000 UTC m=+279.233783099" watchObservedRunningTime="2025-09-30 17:06:18.328214366 +0000 UTC m=+279.235227197" Sep 30 17:06:19 crc kubenswrapper[4772]: I0930 17:06:19.312527 4772 generic.go:334] "Generic (PLEG): container finished" podID="1522b5bf-cf61-4f95-a15c-63245f3eab54" containerID="243b81e35c04613761052aae543c348063812550364b84cbc57e11f9644d9b72" exitCode=0 Sep 30 17:06:19 crc kubenswrapper[4772]: I0930 17:06:19.312592 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7knsg" event={"ID":"1522b5bf-cf61-4f95-a15c-63245f3eab54","Type":"ContainerDied","Data":"243b81e35c04613761052aae543c348063812550364b84cbc57e11f9644d9b72"} Sep 30 17:06:20 crc kubenswrapper[4772]: I0930 17:06:20.319826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7knsg" event={"ID":"1522b5bf-cf61-4f95-a15c-63245f3eab54","Type":"ContainerStarted","Data":"04f53261c182581080599e26cc4f53d03b7133dfc026c5170971b3923918c483"} Sep 30 17:06:20 crc kubenswrapper[4772]: I0930 17:06:20.346572 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7knsg" podStartSLOduration=2.645186498 podStartE2EDuration="6.346551556s" podCreationTimestamp="2025-09-30 17:06:14 +0000 UTC" firstStartedPulling="2025-09-30 17:06:16.283739962 +0000 UTC m=+277.190752793" lastFinishedPulling="2025-09-30 17:06:19.98510502 +0000 UTC m=+280.892117851" observedRunningTime="2025-09-30 17:06:20.343069035 +0000 UTC m=+281.250081876" watchObservedRunningTime="2025-09-30 17:06:20.346551556 +0000 UTC m=+281.253564387" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.034902 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.035556 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.087446 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.231476 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.231556 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.279135 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.369645 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4zvrj" Sep 30 17:06:22 crc kubenswrapper[4772]: I0930 17:06:22.380999 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-22cvm" Sep 30 17:06:24 crc kubenswrapper[4772]: I0930 17:06:24.443538 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:24 crc kubenswrapper[4772]: 
I0930 17:06:24.444361 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:24 crc kubenswrapper[4772]: I0930 17:06:24.489869 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:06:24 crc kubenswrapper[4772]: I0930 17:06:24.658452 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:24 crc kubenswrapper[4772]: I0930 17:06:24.658513 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:24 crc kubenswrapper[4772]: I0930 17:06:24.700966 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:25 crc kubenswrapper[4772]: I0930 17:06:25.397802 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7knsg" Sep 30 17:06:25 crc kubenswrapper[4772]: I0930 17:06:25.409357 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cl5mx" Sep 30 17:07:08 crc kubenswrapper[4772]: I0930 17:07:08.655835 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:07:08 crc kubenswrapper[4772]: I0930 17:07:08.656517 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:07:38 crc kubenswrapper[4772]: I0930 17:07:38.655995 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:07:38 crc kubenswrapper[4772]: I0930 17:07:38.656652 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.655591 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.656335 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:08:08 
crc kubenswrapper[4772]: I0930 17:08:08.656404 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.657165 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.657230 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6" gracePeriod=600 Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.956673 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6" exitCode=0 Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.956719 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6"} Sep 30 17:08:08 crc kubenswrapper[4772]: I0930 17:08:08.956761 4772 scope.go:117] "RemoveContainer" containerID="bfc9d25ec3fa1b98248a3c3ff5c8bdbfdcee0e6a373f0cdbafaae15211dad816" Sep 30 17:08:09 crc kubenswrapper[4772]: I0930 17:08:09.965287 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a"} Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.389457 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqs5r"] Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.390886 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.416899 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqs5r"] Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555306 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555369 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-trusted-ca\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfgn\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-kube-api-access-xxfgn\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555434 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-tls\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555463 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-certificates\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555505 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555529 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.555560 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-bound-sa-token\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.578213 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-bound-sa-token\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656777 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656799 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-trusted-ca\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656824 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfgn\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-kube-api-access-xxfgn\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-tls\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656889 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-certificates\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.656920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.657408 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.658793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-certificates\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.660036 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-trusted-ca\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.665025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.665149 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-registry-tls\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.679960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfgn\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-kube-api-access-xxfgn\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.680313 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ca3858a-f924-4e7a-8406-42ddd59ed6c5-bound-sa-token\") pod \"image-registry-66df7c8f76-cqs5r\" (UID: \"6ca3858a-f924-4e7a-8406-42ddd59ed6c5\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.706874 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:51 crc kubenswrapper[4772]: I0930 17:08:51.974719 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqs5r"] Sep 30 17:08:52 crc kubenswrapper[4772]: I0930 17:08:52.227102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" event={"ID":"6ca3858a-f924-4e7a-8406-42ddd59ed6c5","Type":"ContainerStarted","Data":"917b0d31fc999dc4039cfc068c3353f1367847e19fc4bcd55e90cdd301270093"} Sep 30 17:08:53 crc kubenswrapper[4772]: I0930 17:08:53.234647 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" event={"ID":"6ca3858a-f924-4e7a-8406-42ddd59ed6c5","Type":"ContainerStarted","Data":"08a2b7759b29c92cf551f01cb677852af371feb64577fb9b9cdb13211618beff"} Sep 30 17:08:53 crc kubenswrapper[4772]: I0930 17:08:53.234835 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:08:53 crc kubenswrapper[4772]: I0930 17:08:53.263588 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" podStartSLOduration=2.263565618 podStartE2EDuration="2.263565618s" podCreationTimestamp="2025-09-30 17:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:08:53.263045455 +0000 UTC m=+434.170058326" watchObservedRunningTime="2025-09-30 17:08:53.263565618 +0000 UTC m=+434.170578439" Sep 30 17:09:11 crc kubenswrapper[4772]: I0930 17:09:11.716725 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cqs5r" Sep 30 17:09:11 crc kubenswrapper[4772]: I0930 17:09:11.780777 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:09:36 crc kubenswrapper[4772]: I0930 17:09:36.835176 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" podUID="38933d3b-1f86-415d-923c-c8366e93021f" containerName="registry" containerID="cri-o://c92641fc3906b5aa25a2e94584b352662ea536f555e071692f4ba237ba88a30a" gracePeriod=30 Sep 30 17:09:37 crc kubenswrapper[4772]: I0930 17:09:37.489074 4772 generic.go:334] "Generic (PLEG): container finished" podID="38933d3b-1f86-415d-923c-c8366e93021f" containerID="c92641fc3906b5aa25a2e94584b352662ea536f555e071692f4ba237ba88a30a" exitCode=0 Sep 30 17:09:37 crc kubenswrapper[4772]: I0930 17:09:37.489127 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" event={"ID":"38933d3b-1f86-415d-923c-c8366e93021f","Type":"ContainerDied","Data":"c92641fc3906b5aa25a2e94584b352662ea536f555e071692f4ba237ba88a30a"} Sep 30 17:09:37 crc kubenswrapper[4772]: I0930 17:09:37.916779 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.043893 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.043965 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044008 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044035 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044073 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044100 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hldm\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.044288 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls\") pod \"38933d3b-1f86-415d-923c-c8366e93021f\" (UID: \"38933d3b-1f86-415d-923c-c8366e93021f\") " Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.045373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.045427 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.059342 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm" (OuterVolumeSpecName: "kube-api-access-5hldm") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "kube-api-access-5hldm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.059335 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.060427 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.060746 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.063622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.076495 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "38933d3b-1f86-415d-923c-c8366e93021f" (UID: "38933d3b-1f86-415d-923c-c8366e93021f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145777 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38933d3b-1f86-415d-923c-c8366e93021f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145822 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145834 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145847 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38933d3b-1f86-415d-923c-c8366e93021f-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145865 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hldm\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-kube-api-access-5hldm\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145874 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38933d3b-1f86-415d-923c-c8366e93021f-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.145884 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38933d3b-1f86-415d-923c-c8366e93021f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.498947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" event={"ID":"38933d3b-1f86-415d-923c-c8366e93021f","Type":"ContainerDied","Data":"3a485eea8d8abc6ee851fd0104ed135e85d3e5e18eed7aedb14f57c9ffadfc53"} Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.498989 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2gtql" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.499003 4772 scope.go:117] "RemoveContainer" containerID="c92641fc3906b5aa25a2e94584b352662ea536f555e071692f4ba237ba88a30a" Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.546507 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:09:38 crc kubenswrapper[4772]: I0930 17:09:38.552031 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2gtql"] Sep 30 17:09:39 crc kubenswrapper[4772]: I0930 17:09:39.924525 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38933d3b-1f86-415d-923c-c8366e93021f" path="/var/lib/kubelet/pods/38933d3b-1f86-415d-923c-c8366e93021f/volumes" Sep 30 17:10:38 crc kubenswrapper[4772]: I0930 17:10:38.655350 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:10:38 crc kubenswrapper[4772]: I0930 17:10:38.656235 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:11:08 crc kubenswrapper[4772]: I0930 17:11:08.655369 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:11:08 crc kubenswrapper[4772]: I0930 17:11:08.656589 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:11:38 crc kubenswrapper[4772]: I0930 17:11:38.655303 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:11:38 crc kubenswrapper[4772]: I0930 17:11:38.655953 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:11:38 crc kubenswrapper[4772]: I0930 17:11:38.656007 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:11:38 crc kubenswrapper[4772]: I0930 17:11:38.656658 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:11:38 crc kubenswrapper[4772]: I0930 17:11:38.656723 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a" gracePeriod=600 Sep 30 17:11:39 crc kubenswrapper[4772]: I0930 17:11:39.290756 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a" exitCode=0 Sep 30 17:11:39 crc kubenswrapper[4772]: I0930 17:11:39.290850 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a"} Sep 30 17:11:39 crc kubenswrapper[4772]: I0930 17:11:39.291198 4772 scope.go:117] "RemoveContainer" containerID="f10d5a8a8c6ce091c5f99aa8d4034ddb62a154de5d305c39bd7a051c8a0375f6" Sep 30 17:11:40 crc kubenswrapper[4772]: I0930 17:11:40.298417 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622"} Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.667176 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qtth7"] Sep 30 17:12:00 crc kubenswrapper[4772]: E0930 17:12:00.668653 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38933d3b-1f86-415d-923c-c8366e93021f" containerName="registry" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.668726 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38933d3b-1f86-415d-923c-c8366e93021f" containerName="registry" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.669477 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="38933d3b-1f86-415d-923c-c8366e93021f" containerName="registry" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.672361 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qtth7" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.676292 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ws7dg"] Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.678788 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.679571 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.679835 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.680117 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jkvq4" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.680891 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lhb2m" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.699277 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qtth7"] Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.703494 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lkvcc"] Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.705807 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.710664 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxdh8\" (UniqueName: \"kubernetes.io/projected/9e9dcd73-971e-4f2f-869a-317159d2c9a5-kube-api-access-xxdh8\") pod \"cert-manager-webhook-5655c58dd6-lkvcc\" (UID: \"9e9dcd73-971e-4f2f-869a-317159d2c9a5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.710768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfmc\" (UniqueName: \"kubernetes.io/projected/525413d1-592e-482c-a45a-0e88bfc94da5-kube-api-access-5nfmc\") pod \"cert-manager-cainjector-7f985d654d-ws7dg\" (UID: \"525413d1-592e-482c-a45a-0e88bfc94da5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.710811 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwgn\" (UniqueName: \"kubernetes.io/projected/75f76096-4236-46b9-8e3b-9e6784362607-kube-api-access-9kwgn\") pod \"cert-manager-5b446d88c5-qtth7\" (UID: \"75f76096-4236-46b9-8e3b-9e6784362607\") " pod="cert-manager/cert-manager-5b446d88c5-qtth7" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.715488 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ws7dg"] Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.716369 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lkvcc"] Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.717770 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gs2c5" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.812817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxdh8\" (UniqueName: \"kubernetes.io/projected/9e9dcd73-971e-4f2f-869a-317159d2c9a5-kube-api-access-xxdh8\") pod \"cert-manager-webhook-5655c58dd6-lkvcc\" (UID: \"9e9dcd73-971e-4f2f-869a-317159d2c9a5\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.813289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfmc\" (UniqueName: \"kubernetes.io/projected/525413d1-592e-482c-a45a-0e88bfc94da5-kube-api-access-5nfmc\") pod \"cert-manager-cainjector-7f985d654d-ws7dg\" (UID: \"525413d1-592e-482c-a45a-0e88bfc94da5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.813509 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwgn\" (UniqueName: \"kubernetes.io/projected/75f76096-4236-46b9-8e3b-9e6784362607-kube-api-access-9kwgn\") pod \"cert-manager-5b446d88c5-qtth7\" (UID: \"75f76096-4236-46b9-8e3b-9e6784362607\") " pod="cert-manager/cert-manager-5b446d88c5-qtth7" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.833961 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfmc\" (UniqueName: \"kubernetes.io/projected/525413d1-592e-482c-a45a-0e88bfc94da5-kube-api-access-5nfmc\") pod \"cert-manager-cainjector-7f985d654d-ws7dg\" (UID: \"525413d1-592e-482c-a45a-0e88bfc94da5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.834194 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxdh8\" (UniqueName: \"kubernetes.io/projected/9e9dcd73-971e-4f2f-869a-317159d2c9a5-kube-api-access-xxdh8\") pod \"cert-manager-webhook-5655c58dd6-lkvcc\" (UID: \"9e9dcd73-971e-4f2f-869a-317159d2c9a5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:00 crc kubenswrapper[4772]: I0930 17:12:00.836272 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwgn\" (UniqueName: \"kubernetes.io/projected/75f76096-4236-46b9-8e3b-9e6784362607-kube-api-access-9kwgn\") pod \"cert-manager-5b446d88c5-qtth7\" (UID: \"75f76096-4236-46b9-8e3b-9e6784362607\") " pod="cert-manager/cert-manager-5b446d88c5-qtth7" Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.004501 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qtth7" Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.028368 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.036686 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.259414 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qtth7"] Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.285608 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.428072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qtth7" event={"ID":"75f76096-4236-46b9-8e3b-9e6784362607","Type":"ContainerStarted","Data":"c19c2da67acadaf3d4b9f7f115255737650917b645bc12aeecdba1b4cce6e7d4"} Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.527252 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ws7dg"] Sep 30 17:12:01 crc kubenswrapper[4772]: W0930 17:12:01.536557 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod525413d1_592e_482c_a45a_0e88bfc94da5.slice/crio-f09d20dfbf1707cbffb85136e93291fe787c08960819228e681502b54859aaad WatchSource:0}: Error finding container f09d20dfbf1707cbffb85136e93291fe787c08960819228e681502b54859aaad: Status 404 returned error can't find the container with id f09d20dfbf1707cbffb85136e93291fe787c08960819228e681502b54859aaad Sep 30 17:12:01 crc kubenswrapper[4772]: I0930 17:12:01.545316 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lkvcc"] Sep 30 17:12:01 crc kubenswrapper[4772]: W0930 17:12:01.549171 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e9dcd73_971e_4f2f_869a_317159d2c9a5.slice/crio-3fca964a78e756911b8329ee4ca7ae4796b9ae9ca97117d2c596197c8e99904a WatchSource:0}: Error finding container 3fca964a78e756911b8329ee4ca7ae4796b9ae9ca97117d2c596197c8e99904a: Status 404 returned error can't find the container with id 3fca964a78e756911b8329ee4ca7ae4796b9ae9ca97117d2c596197c8e99904a Sep 30 17:12:02 crc kubenswrapper[4772]: I0930 17:12:02.436662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" event={"ID":"525413d1-592e-482c-a45a-0e88bfc94da5","Type":"ContainerStarted","Data":"f09d20dfbf1707cbffb85136e93291fe787c08960819228e681502b54859aaad"} Sep 30 17:12:02 crc kubenswrapper[4772]: I0930 17:12:02.439614 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" event={"ID":"9e9dcd73-971e-4f2f-869a-317159d2c9a5","Type":"ContainerStarted","Data":"3fca964a78e756911b8329ee4ca7ae4796b9ae9ca97117d2c596197c8e99904a"} Sep 30 17:12:04 crc kubenswrapper[4772]: I0930 17:12:04.449916 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qtth7" event={"ID":"75f76096-4236-46b9-8e3b-9e6784362607","Type":"ContainerStarted","Data":"57b01816c326a8f7c60954f4c041282362345455bb6e998bff96f2078fe3e7b8"} Sep 30 17:12:04 crc kubenswrapper[4772]: I0930 17:12:04.465364 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qtth7" podStartSLOduration=1.600313142 podStartE2EDuration="4.465345107s" podCreationTimestamp="2025-09-30 17:12:00 +0000 UTC" firstStartedPulling="2025-09-30 17:12:01.285330723 +0000 UTC m=+622.192343544" 
lastFinishedPulling="2025-09-30 17:12:04.150362668 +0000 UTC m=+625.057375509" observedRunningTime="2025-09-30 17:12:04.463814097 +0000 UTC m=+625.370826928" watchObservedRunningTime="2025-09-30 17:12:04.465345107 +0000 UTC m=+625.372357948" Sep 30 17:12:08 crc kubenswrapper[4772]: I0930 17:12:08.472151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" event={"ID":"525413d1-592e-482c-a45a-0e88bfc94da5","Type":"ContainerStarted","Data":"6030ebdd67b9493b0996fe739e44f55d14c5f0619ee4ff6f63a46e640bb38150"} Sep 30 17:12:08 crc kubenswrapper[4772]: I0930 17:12:08.475042 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" event={"ID":"9e9dcd73-971e-4f2f-869a-317159d2c9a5","Type":"ContainerStarted","Data":"6eff0901a144c3f52b3c07c6721c49915e795fbc885c10ad9063345d234e577c"} Sep 30 17:12:08 crc kubenswrapper[4772]: I0930 17:12:08.475179 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:08 crc kubenswrapper[4772]: I0930 17:12:08.489250 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-ws7dg" podStartSLOduration=2.208655223 podStartE2EDuration="8.489231093s" podCreationTimestamp="2025-09-30 17:12:00 +0000 UTC" firstStartedPulling="2025-09-30 17:12:01.539179542 +0000 UTC m=+622.446192373" lastFinishedPulling="2025-09-30 17:12:07.819755412 +0000 UTC m=+628.726768243" observedRunningTime="2025-09-30 17:12:08.485835174 +0000 UTC m=+629.392848005" watchObservedRunningTime="2025-09-30 17:12:08.489231093 +0000 UTC m=+629.396243924" Sep 30 17:12:08 crc kubenswrapper[4772]: I0930 17:12:08.500922 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" podStartSLOduration=2.22613119 podStartE2EDuration="8.500901068s" podCreationTimestamp="2025-09-30 17:12:00 +0000 UTC" firstStartedPulling="2025-09-30 17:12:01.55169908 +0000 UTC m=+622.458711911" lastFinishedPulling="2025-09-30 17:12:07.826468958 +0000 UTC m=+628.733481789" observedRunningTime="2025-09-30 17:12:08.500050826 +0000 UTC m=+629.407063677" watchObservedRunningTime="2025-09-30 17:12:08.500901068 +0000 UTC m=+629.407913889" Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.928603 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bj99l"] Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929025 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-controller" containerID="cri-o://e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929362 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="sbdb" containerID="cri-o://485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929398 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="nbdb" 
containerID="cri-o://c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929431 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="northd" containerID="cri-o://dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929456 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929526 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-node" containerID="cri-o://1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.929559 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-acl-logging" containerID="cri-o://95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63" gracePeriod=30 Sep 30 17:12:10 crc kubenswrapper[4772]: I0930 17:12:10.966898 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" containerID="cri-o://8349cf0ac4454fe23d9f83ac717bce1f5de2645c6ceda50c1052f259339b3be3" gracePeriod=30 Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.207176 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.207187 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.208909 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.209683 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.210677 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.210723 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="sbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.211190 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.211223 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="nbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.494360 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/2.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.494812 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/1.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.494866 4772 generic.go:334] "Generic (PLEG): container finished" podID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" containerID="dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83" exitCode=2 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.494931 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerDied","Data":"dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.494989 4772 scope.go:117] "RemoveContainer" containerID="8ef1189b32001cded42b3c4fd17f81a9c4075e8b0f54d72799fa4306e83cd670" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.495517 4772 scope.go:117] "RemoveContainer" containerID="dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83" Sep 30 17:12:11 
crc kubenswrapper[4772]: E0930 17:12:11.496017 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7br52_openshift-multus(5e5b90d4-3f5e-49d8-b2c5-175948eeeda6)\"" pod="openshift-multus/multus-7br52" podUID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.498697 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovnkube-controller/3.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.501170 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-acl-logging/0.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.501656 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-controller/0.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502029 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="8349cf0ac4454fe23d9f83ac717bce1f5de2645c6ceda50c1052f259339b3be3" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502049 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502078 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502086 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502094 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502101 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70" exitCode=0 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502108 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63" exitCode=143 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502120 4772 generic.go:334] "Generic (PLEG): container finished" podID="47daa5db-853e-45af-98ae-489980c97641" containerID="e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9" exitCode=143 Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502103 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"8349cf0ac4454fe23d9f83ac717bce1f5de2645c6ceda50c1052f259339b3be3"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502151 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502162 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502173 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502183 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502195 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502204 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.502213 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9"} Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.531540 4772 scope.go:117] "RemoveContainer" containerID="39b8c13b30c627e71f45f26ba91cd39fe177be09b97bbad24143b88aa1af31dc" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.891158 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-acl-logging/0.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.891799 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-controller/0.log" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.892358 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946217 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dft4s"] Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946480 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946497 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946508 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946516 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946526 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946534 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946550 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="nbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946557 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="nbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946570 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946579 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946589 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="sbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946597 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="sbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946609 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946616 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946626 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-node" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946634 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-node" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946644 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="northd" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946651 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="northd" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946658 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946665 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946674 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-acl-logging" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946686 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-acl-logging" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.946698 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kubecfg-setup" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946705 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kubecfg-setup" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946818 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="northd" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946829 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946837 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="nbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946846 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946856 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946866 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="sbdb" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946879 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946889 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946897 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946910 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946922 4772 
memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovn-acl-logging" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.946930 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="kube-rbac-proxy-node" Sep 30 17:12:11 crc kubenswrapper[4772]: E0930 17:12:11.947041 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.947055 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47daa5db-853e-45af-98ae-489980c97641" containerName="ovnkube-controller" Sep 30 17:12:11 crc kubenswrapper[4772]: I0930 17:12:11.949044 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002222 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002749 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27g86\" (UniqueName: \"kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002841 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002876 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002900 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002920 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002941 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002963 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002980 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.002998 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003019 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003079 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003119 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003205 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003254 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003281 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003305 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004011 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004022 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). 
InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004106 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004142 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash" (OuterVolumeSpecName: "host-slash") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003965 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.003227 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004534 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004574 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004588 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket\") pod \"47daa5db-853e-45af-98ae-489980c97641\" (UID: \"47daa5db-853e-45af-98ae-489980c97641\") " Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004625 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log" (OuterVolumeSpecName: "node-log") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket" (OuterVolumeSpecName: "log-socket") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d26p\" (UniqueName: \"kubernetes.io/projected/0d1cf059-248e-4747-b217-5ef7be24858c-kube-api-access-8d26p\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004824 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-slash\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004865 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-ovn\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004926 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-netd\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.004965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005052 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-kubelet\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005123 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-netns\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005147 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-bin\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005181 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-log-socket\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-systemd\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005258 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-script-lib\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005348 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-env-overrides\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-etc-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-systemd-units\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005437 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0d1cf059-248e-4747-b217-5ef7be24858c-ovn-node-metrics-cert\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005456 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-node-log\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005479 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-config\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-var-lib-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005736 4772 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005753 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005766 4772 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-kubelet\") on node \"crc\" DevicePath 
\"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005777 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005791 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005804 4772 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005816 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005828 4772 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005840 4772 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005852 4772 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005863 4772 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005876 4772 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005888 4772 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005900 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47daa5db-853e-45af-98ae-489980c97641-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005912 4772 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.005925 4772 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc 
kubenswrapper[4772]: I0930 17:12:12.005937 4772 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.008841 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.009208 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86" (OuterVolumeSpecName: "kube-api-access-27g86") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "kube-api-access-27g86". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.016973 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "47daa5db-853e-45af-98ae-489980c97641" (UID: "47daa5db-853e-45af-98ae-489980c97641"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.106948 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107013 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-env-overrides\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107042 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-etc-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107093 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-systemd-units\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107115 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: 
\"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107172 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-etc-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0d1cf059-248e-4747-b217-5ef7be24858c-ovn-node-metrics-cert\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107205 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-systemd-units\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-node-log\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107290 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-config\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107306 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-node-log\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-var-lib-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107401 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d26p\" (UniqueName: \"kubernetes.io/projected/0d1cf059-248e-4747-b217-5ef7be24858c-kube-api-access-8d26p\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107423 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-slash\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc 
kubenswrapper[4772]: I0930 17:12:12.107443 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-ovn\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107503 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-netd\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107547 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107576 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-kubelet\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107580 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-var-lib-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107598 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-netns\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107657 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-slash\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107674 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-netd\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107691 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-bin\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107698 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-ovn-kubernetes\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-cni-bin\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-openvswitch\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107742 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-run-netns\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107613 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-ovn\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-log-socket\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-log-socket\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107814 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-host-kubelet\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107878 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-systemd\") pod \"ovnkube-node-dft4s\" (UID: 
\"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1cf059-248e-4747-b217-5ef7be24858c-run-systemd\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.107987 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-script-lib\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108150 4772 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47daa5db-853e-45af-98ae-489980c97641-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108168 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47daa5db-853e-45af-98ae-489980c97641-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108181 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27g86\" (UniqueName: \"kubernetes.io/projected/47daa5db-853e-45af-98ae-489980c97641-kube-api-access-27g86\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-env-overrides\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108703 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-config\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.108716 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d1cf059-248e-4747-b217-5ef7be24858c-ovnkube-script-lib\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.113920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0d1cf059-248e-4747-b217-5ef7be24858c-ovn-node-metrics-cert\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.125339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d26p\" (UniqueName: \"kubernetes.io/projected/0d1cf059-248e-4747-b217-5ef7be24858c-kube-api-access-8d26p\") pod \"ovnkube-node-dft4s\" (UID: \"0d1cf059-248e-4747-b217-5ef7be24858c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.261771 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:12 crc kubenswrapper[4772]: W0930 17:12:12.292814 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1cf059_248e_4747_b217_5ef7be24858c.slice/crio-d89e9c90ca2d8334ff2e2d104c03deb84f17cf4d444570b0cc510f30bbb73c97 WatchSource:0}: Error finding container d89e9c90ca2d8334ff2e2d104c03deb84f17cf4d444570b0cc510f30bbb73c97: Status 404 returned error can't find the container with id d89e9c90ca2d8334ff2e2d104c03deb84f17cf4d444570b0cc510f30bbb73c97 Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.510356 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"d89e9c90ca2d8334ff2e2d104c03deb84f17cf4d444570b0cc510f30bbb73c97"} Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.511973 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/2.log" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.515453 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-acl-logging/0.log" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.515833 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bj99l_47daa5db-853e-45af-98ae-489980c97641/ovn-controller/0.log" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.516142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" event={"ID":"47daa5db-853e-45af-98ae-489980c97641","Type":"ContainerDied","Data":"47465566be7e0751b90cd0d519586f43f7758fc8174172bd466eea0feebee39b"} Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.516179 4772 scope.go:117] "RemoveContainer" containerID="8349cf0ac4454fe23d9f83ac717bce1f5de2645c6ceda50c1052f259339b3be3" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.516311 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bj99l" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.534305 4772 scope.go:117] "RemoveContainer" containerID="485f03ed7f186aadf0dea3d6160fc41861c954cc81a0ffdf7b56f37de3872af0" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.553154 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bj99l"] Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.555961 4772 scope.go:117] "RemoveContainer" containerID="c10a299898d91353321e173ee275b7a37e5534b0d0299d464b8ba6d265c742b8" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.559052 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bj99l"] Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.572094 4772 scope.go:117] "RemoveContainer" containerID="dbf43c547173663014b2f60c5b86a429b88880986078a3eb4d911fdfe9ee9b8f" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.591665 4772 scope.go:117] "RemoveContainer" containerID="00a741a92f583d7a61859b278efcc9e7dc1b443b04af02e70c3070627e54929b" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.607137 4772 scope.go:117] "RemoveContainer" containerID="1e32eb364652adde19f8d06de04e409ed0f4ae5da94fc8b9cb170826fc005c70" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.619781 4772 scope.go:117] "RemoveContainer" containerID="95b00075aa13f3b74180826685d550d6d0caed8f5711e6be2c01a47f06f89b63" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.633610 4772 scope.go:117] "RemoveContainer" containerID="e61239cd727dd9f94edd2b5b2344edf209b73911bbfc6826e598217ea91e74a9" Sep 30 17:12:12 crc kubenswrapper[4772]: I0930 17:12:12.651933 4772 scope.go:117] "RemoveContainer" containerID="54c0edfd356b54fe88ffcae08c28df90190cffc36e17b82b6059994a3cfc7d8c" Sep 30 17:12:13 crc kubenswrapper[4772]: I0930 17:12:13.524115 4772 generic.go:334] "Generic (PLEG): container finished" podID="0d1cf059-248e-4747-b217-5ef7be24858c" containerID="b55684bdde6bf8b73e9ea2cb6a449323cb8691054efb16cd6f48a78f6ee7d908" exitCode=0 Sep 30 17:12:13 crc kubenswrapper[4772]: I0930 17:12:13.524228 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerDied","Data":"b55684bdde6bf8b73e9ea2cb6a449323cb8691054efb16cd6f48a78f6ee7d908"} Sep 30 17:12:13 crc kubenswrapper[4772]: I0930 17:12:13.903925 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47daa5db-853e-45af-98ae-489980c97641" path="/var/lib/kubelet/pods/47daa5db-853e-45af-98ae-489980c97641/volumes" Sep 30 17:12:14 crc kubenswrapper[4772]: I0930 17:12:14.537254 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"870dbe4beb063275f23871e37954b1a685705c636bd14803c4e7289f1cd6ba21"} Sep 30 17:12:14 crc kubenswrapper[4772]: I0930 17:12:14.538410 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"dc19a6143e6fc6d16aff1ec68d755e30639013185fd8d9505dbeb18dbfcde5df"} Sep 30 17:12:15 crc kubenswrapper[4772]: I0930 17:12:15.547256 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" 
event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"e2ca96cb8ae1ba96499d347a00c1c6fdfd646dd61c2a9caa46fc5704e901a8d2"} Sep 30 17:12:15 crc kubenswrapper[4772]: I0930 17:12:15.547564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"d7c14a38cc12a43cf7cf34c9a04bd18f4955318c3c9cdba3e214cef99e79d145"} Sep 30 17:12:15 crc kubenswrapper[4772]: I0930 17:12:15.547576 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"f13c8a10f9e76bfd755e985db74b612c5afdd56a5f2a8f46159a2cf20a8b5d1d"} Sep 30 17:12:16 crc kubenswrapper[4772]: I0930 17:12:16.041429 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-lkvcc" Sep 30 17:12:16 crc kubenswrapper[4772]: I0930 17:12:16.556969 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"3a58b3af4efe98eef9226f87dc0c74ace854e014f91888942f0d488f2c696db9"} Sep 30 17:12:19 crc kubenswrapper[4772]: I0930 17:12:19.577216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"912d577982116e68e7250caf9b6e65c20659dfe4c2272ba09e78e42399c57f19"} Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.595559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" event={"ID":"0d1cf059-248e-4747-b217-5ef7be24858c","Type":"ContainerStarted","Data":"57cdc195ae7aae9b346b8f3f4b0f6a471ff44ec4d0eb02319d22f744d162dae8"} Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.596912 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.596935 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.596953 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.632392 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.638637 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:21 crc kubenswrapper[4772]: I0930 17:12:21.674133 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" podStartSLOduration=10.674102009 podStartE2EDuration="10.674102009s" podCreationTimestamp="2025-09-30 17:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:12:21.629085802 +0000 UTC m=+642.536098653" watchObservedRunningTime="2025-09-30 17:12:21.674102009 +0000 UTC m=+642.581114850" Sep 30 17:12:25 crc kubenswrapper[4772]: I0930 17:12:25.898270 4772 scope.go:117] "RemoveContainer" 
containerID="dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83" Sep 30 17:12:25 crc kubenswrapper[4772]: E0930 17:12:25.899348 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7br52_openshift-multus(5e5b90d4-3f5e-49d8-b2c5-175948eeeda6)\"" pod="openshift-multus/multus-7br52" podUID="5e5b90d4-3f5e-49d8-b2c5-175948eeeda6" Sep 30 17:12:39 crc kubenswrapper[4772]: I0930 17:12:39.899842 4772 scope.go:117] "RemoveContainer" containerID="dd0542a8a6e1f74fa0a0bfda28a793973346f624d7bfe562855a5502e5c9ce83" Sep 30 17:12:40 crc kubenswrapper[4772]: I0930 17:12:40.703978 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7br52_5e5b90d4-3f5e-49d8-b2c5-175948eeeda6/kube-multus/2.log" Sep 30 17:12:40 crc kubenswrapper[4772]: I0930 17:12:40.704538 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7br52" event={"ID":"5e5b90d4-3f5e-49d8-b2c5-175948eeeda6","Type":"ContainerStarted","Data":"21ddb98b6e9ccb7f0e4dcbae3a64d6e2c53f1f4442c55dd3d4a2871854d98fec"} Sep 30 17:12:42 crc kubenswrapper[4772]: I0930 17:12:42.285751 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dft4s" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.606461 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh"] Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.608282 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.610487 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.624310 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh"] Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.638974 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.639327 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.639355 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bng\" (UniqueName: \"kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: 
\"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.740406 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.740491 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bng\" (UniqueName: \"kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.740636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.740921 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.741321 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.759356 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bng\" (UniqueName: \"kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:51 crc kubenswrapper[4772]: I0930 17:12:51.930098 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:52 crc kubenswrapper[4772]: I0930 17:12:52.124803 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh"] Sep 30 17:12:52 crc kubenswrapper[4772]: W0930 17:12:52.127021 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88994732_fb76_4b3b_aff1_9f27baea5f53.slice/crio-22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2 WatchSource:0}: Error finding container 22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2: Status 404 returned error can't find the container with id 22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2 Sep 30 17:12:52 crc kubenswrapper[4772]: I0930 17:12:52.780736 4772 generic.go:334] "Generic (PLEG): container finished" podID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerID="7d8fadc1200ad8a4132fbdf9ae5a4fa4ad9ead59d5c86f563796633ce8a0e68b" exitCode=0 Sep 30 17:12:52 crc kubenswrapper[4772]: I0930 17:12:52.780785 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" event={"ID":"88994732-fb76-4b3b-aff1-9f27baea5f53","Type":"ContainerDied","Data":"7d8fadc1200ad8a4132fbdf9ae5a4fa4ad9ead59d5c86f563796633ce8a0e68b"} Sep 30 17:12:52 crc kubenswrapper[4772]: I0930 17:12:52.780831 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" event={"ID":"88994732-fb76-4b3b-aff1-9f27baea5f53","Type":"ContainerStarted","Data":"22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2"} Sep 30 17:12:54 crc kubenswrapper[4772]: I0930 17:12:54.798041 4772 generic.go:334] "Generic (PLEG): container finished" podID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerID="8eb475e9f9203d627d36e9a9d2d2452c2ee712cc9bd0c51f5310e2ab71b48e56" exitCode=0 Sep 30 17:12:54 crc kubenswrapper[4772]: I0930 17:12:54.798210 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" event={"ID":"88994732-fb76-4b3b-aff1-9f27baea5f53","Type":"ContainerDied","Data":"8eb475e9f9203d627d36e9a9d2d2452c2ee712cc9bd0c51f5310e2ab71b48e56"} Sep 30 17:12:55 crc kubenswrapper[4772]: I0930 17:12:55.805150 4772 generic.go:334] "Generic (PLEG): container finished" podID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerID="56dc02909499630a14873ab13f4dcac113ca9088981fe406564876f58a0b1483" exitCode=0 Sep 30 17:12:55 crc kubenswrapper[4772]: I0930 17:12:55.805190 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" event={"ID":"88994732-fb76-4b3b-aff1-9f27baea5f53","Type":"ContainerDied","Data":"56dc02909499630a14873ab13f4dcac113ca9088981fe406564876f58a0b1483"} Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.074298 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.216541 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle\") pod \"88994732-fb76-4b3b-aff1-9f27baea5f53\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.216632 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bng\" (UniqueName: \"kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng\") pod \"88994732-fb76-4b3b-aff1-9f27baea5f53\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.216863 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util\") pod \"88994732-fb76-4b3b-aff1-9f27baea5f53\" (UID: \"88994732-fb76-4b3b-aff1-9f27baea5f53\") " Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.219836 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle" (OuterVolumeSpecName: "bundle") pod "88994732-fb76-4b3b-aff1-9f27baea5f53" (UID: "88994732-fb76-4b3b-aff1-9f27baea5f53"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.225403 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng" (OuterVolumeSpecName: "kube-api-access-85bng") pod "88994732-fb76-4b3b-aff1-9f27baea5f53" (UID: "88994732-fb76-4b3b-aff1-9f27baea5f53"). InnerVolumeSpecName "kube-api-access-85bng". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.233992 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util" (OuterVolumeSpecName: "util") pod "88994732-fb76-4b3b-aff1-9f27baea5f53" (UID: "88994732-fb76-4b3b-aff1-9f27baea5f53"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.318750 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.318811 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bng\" (UniqueName: \"kubernetes.io/projected/88994732-fb76-4b3b-aff1-9f27baea5f53-kube-api-access-85bng\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.318826 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88994732-fb76-4b3b-aff1-9f27baea5f53-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.820978 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" event={"ID":"88994732-fb76-4b3b-aff1-9f27baea5f53","Type":"ContainerDied","Data":"22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2"} Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.821379 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22305b8d7bfc52d5fe41ad80f0001b518bf00b9855a575b477122855658ad1c2" Sep 30 17:12:57 crc kubenswrapper[4772]: I0930 17:12:57.821073 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.193780 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk"] Sep 30 17:13:05 crc kubenswrapper[4772]: E0930 17:13:05.194851 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="util" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.194867 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="util" Sep 30 17:13:05 crc kubenswrapper[4772]: E0930 17:13:05.194879 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="pull" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.194885 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="pull" Sep 30 17:13:05 crc kubenswrapper[4772]: E0930 17:13:05.194906 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="extract" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.194913 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="extract" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.195010 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="88994732-fb76-4b3b-aff1-9f27baea5f53" containerName="extract" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.195524 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.198878 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.199040 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.203881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zcrzz" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.206772 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.313690 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.314532 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:05 crc kubenswrapper[4772]: W0930 17:13:05.318877 4772 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-zlcrb": failed to list *v1.Secret: secrets "obo-prometheus-operator-admission-webhook-dockercfg-zlcrb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Sep 30 17:13:05 crc kubenswrapper[4772]: E0930 17:13:05.318933 4772 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-zlcrb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-admission-webhook-dockercfg-zlcrb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:13:05 crc kubenswrapper[4772]: W0930 17:13:05.319560 4772 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert": failed to list *v1.Secret: secrets "obo-prometheus-operator-admission-webhook-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Sep 30 17:13:05 crc kubenswrapper[4772]: E0930 17:13:05.319581 4772 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-admission-webhook-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.325855 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.326749 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.340134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb9st\" (UniqueName: \"kubernetes.io/projected/3cb2995b-6088-4762-8e3b-d99d0eaf03ed-kube-api-access-mb9st\") pod \"obo-prometheus-operator-7c8cf85677-qkggk\" (UID: \"3cb2995b-6088-4762-8e3b-d99d0eaf03ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.344319 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.361808 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.440903 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb9st\" (UniqueName: \"kubernetes.io/projected/3cb2995b-6088-4762-8e3b-d99d0eaf03ed-kube-api-access-mb9st\") pod \"obo-prometheus-operator-7c8cf85677-qkggk\" (UID: \"3cb2995b-6088-4762-8e3b-d99d0eaf03ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.440974 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.440998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.441038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.441068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.473825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb9st\" (UniqueName: 
\"kubernetes.io/projected/3cb2995b-6088-4762-8e3b-d99d0eaf03ed-kube-api-access-mb9st\") pod \"obo-prometheus-operator-7c8cf85677-qkggk\" (UID: \"3cb2995b-6088-4762-8e3b-d99d0eaf03ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.514415 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.541754 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.541797 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.541857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.541874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.551680 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zm69t"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.552378 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.554854 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.556895 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-d66sf" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.571239 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zm69t"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.642931 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce80066-009d-4bb9-8a33-dcb521b0e08c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.642994 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcmt\" (UniqueName: \"kubernetes.io/projected/4ce80066-009d-4bb9-8a33-dcb521b0e08c-kube-api-access-ktcmt\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.747974 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce80066-009d-4bb9-8a33-dcb521b0e08c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.748049 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktcmt\" (UniqueName: \"kubernetes.io/projected/4ce80066-009d-4bb9-8a33-dcb521b0e08c-kube-api-access-ktcmt\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.750764 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-b9gwr"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.751451 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.760264 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce80066-009d-4bb9-8a33-dcb521b0e08c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.767871 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7lpbj" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.776776 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-b9gwr"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.784555 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktcmt\" (UniqueName: \"kubernetes.io/projected/4ce80066-009d-4bb9-8a33-dcb521b0e08c-kube-api-access-ktcmt\") pod \"observability-operator-cc5f78dfc-zm69t\" (UID: \"4ce80066-009d-4bb9-8a33-dcb521b0e08c\") " pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.839585 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk"] Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.849562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/28b404db-1018-43c7-bdba-e2b0d97e1a8c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.849606 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mrv\" (UniqueName: \"kubernetes.io/projected/28b404db-1018-43c7-bdba-e2b0d97e1a8c-kube-api-access-t7mrv\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.880084 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.885968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" event={"ID":"3cb2995b-6088-4762-8e3b-d99d0eaf03ed","Type":"ContainerStarted","Data":"01e365515388e54e2d6a379c28f520a10d087e4c749b97bb922494dddf833caf"} Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.950687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/28b404db-1018-43c7-bdba-e2b0d97e1a8c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.950956 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mrv\" (UniqueName: \"kubernetes.io/projected/28b404db-1018-43c7-bdba-e2b0d97e1a8c-kube-api-access-t7mrv\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.951741 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/28b404db-1018-43c7-bdba-e2b0d97e1a8c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:05 crc kubenswrapper[4772]: I0930 17:13:05.971854 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mrv\" (UniqueName: \"kubernetes.io/projected/28b404db-1018-43c7-bdba-e2b0d97e1a8c-kube-api-access-t7mrv\") pod \"perses-operator-54bc95c9fb-b9gwr\" (UID: \"28b404db-1018-43c7-bdba-e2b0d97e1a8c\") " pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.134281 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.195479 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zm69t"] Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.298559 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-zlcrb" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.356774 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.366324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.366655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/167ceeed-fcd1-409a-b655-f17da9529300-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-drcw5\" (UID: \"167ceeed-fcd1-409a-b655-f17da9529300\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.367120 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.367557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9acc0016-89fe-4a76-a443-b19b593dc666-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn\" (UID: \"9acc0016-89fe-4a76-a443-b19b593dc666\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.374930 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-b9gwr"] Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.528374 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.546019 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.797835 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5"] Sep 30 17:13:06 crc kubenswrapper[4772]: W0930 17:13:06.821257 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod167ceeed_fcd1_409a_b655_f17da9529300.slice/crio-436dea9f8959f77038356e62ecd314e26bbb902c0f88ffb0d8c6b88477b5102d WatchSource:0}: Error finding container 436dea9f8959f77038356e62ecd314e26bbb902c0f88ffb0d8c6b88477b5102d: Status 404 returned error can't find the container with id 436dea9f8959f77038356e62ecd314e26bbb902c0f88ffb0d8c6b88477b5102d Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.829567 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn"] Sep 30 17:13:06 crc kubenswrapper[4772]: W0930 17:13:06.835690 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9acc0016_89fe_4a76_a443_b19b593dc666.slice/crio-a9486aa9056debf485c6e7acc4f271c1a199493f409136a81425170dab969d7e WatchSource:0}: Error finding container a9486aa9056debf485c6e7acc4f271c1a199493f409136a81425170dab969d7e: Status 404 returned error can't find the container with id a9486aa9056debf485c6e7acc4f271c1a199493f409136a81425170dab969d7e Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.894290 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" event={"ID":"4ce80066-009d-4bb9-8a33-dcb521b0e08c","Type":"ContainerStarted","Data":"22233197b9a6b542eb93a1f835c11ab251950b6e9855c56528961184be3997a2"} Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.896627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" event={"ID":"167ceeed-fcd1-409a-b655-f17da9529300","Type":"ContainerStarted","Data":"436dea9f8959f77038356e62ecd314e26bbb902c0f88ffb0d8c6b88477b5102d"} Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.899824 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" event={"ID":"28b404db-1018-43c7-bdba-e2b0d97e1a8c","Type":"ContainerStarted","Data":"1346262628a7ffce10d4fd37bc856c64e1080aa8efe5b7a2862e50883c7032ef"} Sep 30 17:13:06 crc kubenswrapper[4772]: I0930 17:13:06.902006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" event={"ID":"9acc0016-89fe-4a76-a443-b19b593dc666","Type":"ContainerStarted","Data":"a9486aa9056debf485c6e7acc4f271c1a199493f409136a81425170dab969d7e"} Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.208821 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e" Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.209809 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e,Command:[],Args:[--namespace=$(NAMESPACE) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=perses=$(RELATED_IMAGE_PERSES) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:4d25b0e31549d780928d2dd3eed7defd9c6d460deb92dcff0fe72c5023029404,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:f3806c97420ec8ba91895ce7627df7612cccb927c05d7854377f45cdd6c924a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-0-50-rhel9@sha256:4b5e53d226733237fc5abd0476eb3c96162cf3d8da7aeba8deda631fa8987223,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-0-4-rhel9@sha256:53125bddbefca2ba2b57c3fd74bd4b376da803e420201220548878f557bd6610,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-1-0-rhel9@sha256:1dbe9a684271e00c8f36d8b96c9b22f6ee3c6f907ea6ad20980901bd533f9a3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-4-rhel9@sha256:6aafab2c90bcbc6702f2d63d585a764baa8de8207e6af7afa60f3976ddfa9bd3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-3-rhel9@sha256:9f80851e8137c2c5e5c2aee13fc663f6c7124d9524d88c06c1507748ce84e1ed,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-6-1-rhel9@sha256:2c9b2be12f15f06a24393dbab6a31682cee399d42e2cc04b0dcf03b2b598d5cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io
/cluster-observability-operator/logging-console-plugin-6-0-rhel9@sha256:e9042d93f624790c450724158a8323277e4dd136530c763fec8db31f51fd8552,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-0-4-rhel9@sha256:456d45001816b9adc38745e0ad8705bdc0150d03d0f65e0dfa9caf3fb8980fad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-5-rhel9@sha256:f3446969c67c18b44bee38ac946091fe9397a2117cb5b7aacb39406461c1efe1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-4-rhel9@sha256:ade84f8be7d23bd4b9c80e07462dc947280f0bcf6071e6edd927fef54c254b7e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:039e139cf9217bbe72248674df76cbe4baf4bef9f8dc367d2cb51eae9c4aa9d7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:142180f277f0221ef2d4176f9af6dcdb4e7ab434a68f0dfad2ee5bee0e667ddd,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktcmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-cc5f78dfc-zm69t_openshift-operators(4ce80066-009d-4bb9-8a33-dcb521b0e08c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.211162 4772 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" podUID="4ce80066-009d-4bb9-8a33-dcb521b0e08c" Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.991541 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c" Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.991729 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7mrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-54bc95c9fb-b9gwr_openshift-operators(28b404db-1018-43c7-bdba-e2b0d97e1a8c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:13:30 crc kubenswrapper[4772]: E0930 17:13:30.992874 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" podUID="28b404db-1018-43c7-bdba-e2b0d97e1a8c" Sep 30 17:13:31 crc kubenswrapper[4772]: E0930 17:13:31.107355 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Sep 30 17:13:31 crc kubenswrapper[4772]: E0930 17:13:31.107545 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn_openshift-operators(9acc0016-89fe-4a76-a443-b19b593dc666): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:13:31 crc kubenswrapper[4772]: E0930 17:13:31.109584 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" podUID="9acc0016-89fe-4a76-a443-b19b593dc666" Sep 30 17:13:31 crc kubenswrapper[4772]: E0930 17:13:31.115539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c\\\"\"" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" podUID="28b404db-1018-43c7-bdba-e2b0d97e1a8c" Sep 30 17:13:31 crc kubenswrapper[4772]: E0930 17:13:31.116420 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e\\\"\"" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" podUID="4ce80066-009d-4bb9-8a33-dcb521b0e08c" Sep 30 17:13:32 crc kubenswrapper[4772]: I0930 17:13:32.121745 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" event={"ID":"167ceeed-fcd1-409a-b655-f17da9529300","Type":"ContainerStarted","Data":"72411dddf931db7b79abfc116b5ab2018de33aec157a4e4f4ba55fda1ee92185"} Sep 30 17:13:32 crc kubenswrapper[4772]: I0930 17:13:32.124732 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" event={"ID":"3cb2995b-6088-4762-8e3b-d99d0eaf03ed","Type":"ContainerStarted","Data":"d9a39c00b5506266715792cdd7f894a08f6ca5a82b06b8c925aafcc3773a8458"} Sep 30 17:13:32 crc kubenswrapper[4772]: I0930 17:13:32.211012 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-drcw5" podStartSLOduration=3.041996176 podStartE2EDuration="27.210979045s" podCreationTimestamp="2025-09-30 17:13:05 +0000 UTC" firstStartedPulling="2025-09-30 17:13:06.823967104 +0000 UTC m=+687.730979935" lastFinishedPulling="2025-09-30 17:13:30.992949963 +0000 UTC m=+711.899962804" observedRunningTime="2025-09-30 17:13:32.148383224 +0000 UTC m=+713.055396065" watchObservedRunningTime="2025-09-30 17:13:32.210979045 +0000 UTC m=+713.117991876" Sep 30 17:13:32 crc kubenswrapper[4772]: I0930 17:13:32.226373 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qkggk" podStartSLOduration=2.105610333 podStartE2EDuration="27.226344765s" podCreationTimestamp="2025-09-30 17:13:05 +0000 UTC" firstStartedPulling="2025-09-30 17:13:05.856493272 +0000 UTC m=+686.763506103" lastFinishedPulling="2025-09-30 17:13:30.977227714 +0000 UTC m=+711.884240535" observedRunningTime="2025-09-30 17:13:32.221346625 +0000 UTC m=+713.128359466" watchObservedRunningTime="2025-09-30 17:13:32.226344765 +0000 UTC m=+713.133357596" Sep 30 17:13:33 crc kubenswrapper[4772]: I0930 17:13:33.131378 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" event={"ID":"9acc0016-89fe-4a76-a443-b19b593dc666","Type":"ContainerStarted","Data":"6af86edb4f9ef646d7f5d52dc679a85098c5104f2526ae96d07ffecc100ec2f4"} Sep 30 17:13:33 crc kubenswrapper[4772]: I0930 17:13:33.151769 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn" podStartSLOduration=-9223372008.70304 podStartE2EDuration="28.151736341s" podCreationTimestamp="2025-09-30 17:13:05 +0000 UTC" firstStartedPulling="2025-09-30 17:13:06.838723639 +0000 UTC m=+687.745736470" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:13:33.148476716 +0000 UTC m=+714.055489567" watchObservedRunningTime="2025-09-30 17:13:33.151736341 +0000 UTC m=+714.058749172" Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.211718 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" event={"ID":"4ce80066-009d-4bb9-8a33-dcb521b0e08c","Type":"ContainerStarted","Data":"e97e2f8ae842bc700b72a40f2fd032d0f05024d34f6017fcccff02afd8544369"} Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.213690 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.215497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" event={"ID":"28b404db-1018-43c7-bdba-e2b0d97e1a8c","Type":"ContainerStarted","Data":"221f0f6be2a253d25ef29228d452e933bd5a76dfb56ad09cdd0655e6ad2a2724"} Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.215979 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.216439 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.249511 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-zm69t" podStartSLOduration=1.8047304130000001 podStartE2EDuration="42.249487704s" podCreationTimestamp="2025-09-30 17:13:05 +0000 UTC" firstStartedPulling="2025-09-30 17:13:06.205022505 +0000 UTC m=+687.112035336" lastFinishedPulling="2025-09-30 17:13:46.649779786 +0000 UTC m=+727.556792627" observedRunningTime="2025-09-30 17:13:47.245605133 +0000 UTC m=+728.152617964" watchObservedRunningTime="2025-09-30 17:13:47.249487704 +0000 UTC m=+728.156500535" Sep 30 17:13:47 crc kubenswrapper[4772]: I0930 17:13:47.304159 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" podStartSLOduration=1.910095612 podStartE2EDuration="42.304130003s" podCreationTimestamp="2025-09-30 17:13:05 +0000 UTC" firstStartedPulling="2025-09-30 17:13:06.38942725 +0000 UTC m=+687.296440071" lastFinishedPulling="2025-09-30 17:13:46.783461631 +0000 UTC m=+727.690474462" observedRunningTime="2025-09-30 17:13:47.295503958 +0000 UTC m=+728.202516789" watchObservedRunningTime="2025-09-30 17:13:47.304130003 +0000 UTC m=+728.211142834" Sep 30 17:13:56 crc kubenswrapper[4772]: I0930 17:13:56.137152 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-b9gwr" Sep 30 17:14:08 crc kubenswrapper[4772]: I0930 17:14:08.655626 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:14:08 crc kubenswrapper[4772]: I0930 17:14:08.656300 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.610014 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p"] Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.611548 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.613421 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.626336 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p"] Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.651276 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl94\" (UniqueName: \"kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.651357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.651422 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.752784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl94\" (UniqueName: \"kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.752871 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.752932 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.753539 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.753568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.775105 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl94\" (UniqueName: \"kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:12 crc kubenswrapper[4772]: I0930 17:14:12.928135 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:13 crc kubenswrapper[4772]: I0930 17:14:13.358262 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p"] Sep 30 17:14:14 crc kubenswrapper[4772]: I0930 17:14:14.381891 4772 generic.go:334] "Generic (PLEG): container finished" podID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerID="3f961f4727f09daa425eb2fcfbf2aee6da4e4957ef82a92c94455435cac94dc9" exitCode=0 Sep 30 17:14:14 crc kubenswrapper[4772]: I0930 17:14:14.381969 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" event={"ID":"3311e11b-7e62-409e-95e9-88528c9bffbb","Type":"ContainerDied","Data":"3f961f4727f09daa425eb2fcfbf2aee6da4e4957ef82a92c94455435cac94dc9"} Sep 30 17:14:14 crc kubenswrapper[4772]: I0930 17:14:14.382005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" event={"ID":"3311e11b-7e62-409e-95e9-88528c9bffbb","Type":"ContainerStarted","Data":"b138c0558138a854d1c72cd921c3cd4d44f07405e42d5dc7008741016eb446fe"} Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.011746 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.012856 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" 
containerID="cri-o://f4b6de6f3bc5cbdc1fba6901e139438a6de714f24ff9431bcb5b3e852b9fb6d0" gracePeriod=30 Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.095530 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.095747 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerName="route-controller-manager" containerID="cri-o://c3bd4890fc12a6785ad5e3df2e0f7b680510b554973819e14af739cac592187b" gracePeriod=30 Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.401963 4772 generic.go:334] "Generic (PLEG): container finished" podID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerID="c3bd4890fc12a6785ad5e3df2e0f7b680510b554973819e14af739cac592187b" exitCode=0 Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.402337 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" event={"ID":"8187513e-1ddb-4a68-8a95-e5c5b1d2206a","Type":"ContainerDied","Data":"c3bd4890fc12a6785ad5e3df2e0f7b680510b554973819e14af739cac592187b"} Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.406249 4772 generic.go:334] "Generic (PLEG): container finished" podID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerID="f4b6de6f3bc5cbdc1fba6901e139438a6de714f24ff9431bcb5b3e852b9fb6d0" exitCode=0 Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.406303 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" event={"ID":"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4","Type":"ContainerDied","Data":"f4b6de6f3bc5cbdc1fba6901e139438a6de714f24ff9431bcb5b3e852b9fb6d0"} Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.408905 4772 generic.go:334] "Generic (PLEG): container finished" podID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerID="279937d92493ecd1b00f7e7d17a98dce76458af49ed4a39631d1f97ffec746a7" exitCode=0 Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.408933 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" event={"ID":"3311e11b-7e62-409e-95e9-88528c9bffbb","Type":"ContainerDied","Data":"279937d92493ecd1b00f7e7d17a98dce76458af49ed4a39631d1f97ffec746a7"} Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.450130 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.506237 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config\") pod \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.506353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca\") pod \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.506439 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert\") pod \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.506487 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles\") pod \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.506511 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7gg\" (UniqueName: \"kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg\") pod \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\" (UID: \"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.507441 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" (UID: "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.508857 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca" (OuterVolumeSpecName: "client-ca") pod "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" (UID: "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.508894 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config" (OuterVolumeSpecName: "config") pod "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" (UID: "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.520483 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg" (OuterVolumeSpecName: "kube-api-access-dp7gg") pod "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" (UID: "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4"). InnerVolumeSpecName "kube-api-access-dp7gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.523291 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" (UID: "c10c3d38-5396-40cc-8f9b-69a2a9b61ad4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.524923 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.609441 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert\") pod \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.609601 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config\") pod \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.609702 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca\") pod \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.609730 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x5tt\" (UniqueName: \"kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt\") pod \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\" (UID: \"8187513e-1ddb-4a68-8a95-e5c5b1d2206a\") " Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610506 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca" (OuterVolumeSpecName: "client-ca") pod "8187513e-1ddb-4a68-8a95-e5c5b1d2206a" (UID: "8187513e-1ddb-4a68-8a95-e5c5b1d2206a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610639 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config" (OuterVolumeSpecName: "config") pod "8187513e-1ddb-4a68-8a95-e5c5b1d2206a" (UID: "8187513e-1ddb-4a68-8a95-e5c5b1d2206a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610862 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610884 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610896 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610908 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7gg\" (UniqueName: \"kubernetes.io/projected/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-kube-api-access-dp7gg\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610919 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610930 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.610941 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.633732 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt" (OuterVolumeSpecName: "kube-api-access-9x5tt") pod "8187513e-1ddb-4a68-8a95-e5c5b1d2206a" (UID: "8187513e-1ddb-4a68-8a95-e5c5b1d2206a"). InnerVolumeSpecName "kube-api-access-9x5tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.634322 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8187513e-1ddb-4a68-8a95-e5c5b1d2206a" (UID: "8187513e-1ddb-4a68-8a95-e5c5b1d2206a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.712450 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:16 crc kubenswrapper[4772]: I0930 17:14:16.712499 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x5tt\" (UniqueName: \"kubernetes.io/projected/8187513e-1ddb-4a68-8a95-e5c5b1d2206a-kube-api-access-9x5tt\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.418919 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" event={"ID":"8187513e-1ddb-4a68-8a95-e5c5b1d2206a","Type":"ContainerDied","Data":"02d3879ae09992eaea5afdf29f10c28fbfa5c34031fe67691ea19209a13a3d14"} Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.418991 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.418997 4772 scope.go:117] "RemoveContainer" containerID="c3bd4890fc12a6785ad5e3df2e0f7b680510b554973819e14af739cac592187b" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.420562 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" event={"ID":"c10c3d38-5396-40cc-8f9b-69a2a9b61ad4","Type":"ContainerDied","Data":"e4e91750330371b554fb51e47020b4e7875c1dbeb65752cc3ea32046f22cda46"} Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.420621 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ghsks" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.429116 4772 generic.go:334] "Generic (PLEG): container finished" podID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerID="ce3e75ccb0b481f977ddf45b020959de03313092825ac46e9d1affef39d3d499" exitCode=0 Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.429158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" event={"ID":"3311e11b-7e62-409e-95e9-88528c9bffbb","Type":"ContainerDied","Data":"ce3e75ccb0b481f977ddf45b020959de03313092825ac46e9d1affef39d3d499"} Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.445726 4772 scope.go:117] "RemoveContainer" containerID="f4b6de6f3bc5cbdc1fba6901e139438a6de714f24ff9431bcb5b3e852b9fb6d0" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.468617 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.473004 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ghsks"] Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.482591 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.487154 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jwvjm"] Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.905654 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" path="/var/lib/kubelet/pods/8187513e-1ddb-4a68-8a95-e5c5b1d2206a/volumes" Sep 30 17:14:17 crc kubenswrapper[4772]: I0930 17:14:17.906328 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" path="/var/lib/kubelet/pods/c10c3d38-5396-40cc-8f9b-69a2a9b61ad4/volumes" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.002983 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl"] Sep 30 17:14:18 crc kubenswrapper[4772]: E0930 17:14:18.003260 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.003275 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: E0930 17:14:18.003292 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerName="route-controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.003298 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerName="route-controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.003409 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c10c3d38-5396-40cc-8f9b-69a2a9b61ad4" containerName="controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.003426 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8187513e-1ddb-4a68-8a95-e5c5b1d2206a" containerName="route-controller-manager" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.003821 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.005954 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68f5887564-r7xl7"] Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.006559 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.007641 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.007665 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.007695 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.007951 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.008018 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.008139 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.010865 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.011162 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.011551 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.011573 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.011720 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.011828 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.016634 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.024635 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f5887564-r7xl7"] Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.027964 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl"] Sep 30 17:14:18 crc 
kubenswrapper[4772]: I0930 17:14:18.131400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-proxy-ca-bundles\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131462 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk8x\" (UniqueName: \"kubernetes.io/projected/1ca46d68-5fa2-4d99-92da-89a7784f3325-kube-api-access-gkk8x\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131505 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-config\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjx85\" (UniqueName: \"kubernetes.io/projected/c0a3f194-be9a-4431-9173-c73537a73108-kube-api-access-vjx85\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131637 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a3f194-be9a-4431-9173-c73537a73108-serving-cert\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131665 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-config\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131783 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-client-ca\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.131813 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca46d68-5fa2-4d99-92da-89a7784f3325-serving-cert\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc 
kubenswrapper[4772]: I0930 17:14:18.131851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-client-ca\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.233324 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-client-ca\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.233402 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-proxy-ca-bundles\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.233438 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk8x\" (UniqueName: \"kubernetes.io/projected/1ca46d68-5fa2-4d99-92da-89a7784f3325-kube-api-access-gkk8x\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.233902 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-config\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.233970 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjx85\" (UniqueName: \"kubernetes.io/projected/c0a3f194-be9a-4431-9173-c73537a73108-kube-api-access-vjx85\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.234287 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a3f194-be9a-4431-9173-c73537a73108-serving-cert\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.234426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-client-ca\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.234736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-proxy-ca-bundles\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.234886 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-config\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.234946 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-config\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.235001 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-client-ca\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.235029 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca46d68-5fa2-4d99-92da-89a7784f3325-serving-cert\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.235785 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca46d68-5fa2-4d99-92da-89a7784f3325-client-ca\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.236023 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a3f194-be9a-4431-9173-c73537a73108-config\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.241049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca46d68-5fa2-4d99-92da-89a7784f3325-serving-cert\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.241049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a3f194-be9a-4431-9173-c73537a73108-serving-cert\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " 
pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.256698 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkk8x\" (UniqueName: \"kubernetes.io/projected/1ca46d68-5fa2-4d99-92da-89a7784f3325-kube-api-access-gkk8x\") pod \"controller-manager-68f5887564-r7xl7\" (UID: \"1ca46d68-5fa2-4d99-92da-89a7784f3325\") " pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.257668 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjx85\" (UniqueName: \"kubernetes.io/projected/c0a3f194-be9a-4431-9173-c73537a73108-kube-api-access-vjx85\") pod \"route-controller-manager-8bd47f7c-bdxwl\" (UID: \"c0a3f194-be9a-4431-9173-c73537a73108\") " pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.324925 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.338513 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.589752 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl"] Sep 30 17:14:18 crc kubenswrapper[4772]: W0930 17:14:18.597494 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a3f194_be9a_4431_9173_c73537a73108.slice/crio-c6a89b139de20210d4b4374c34afb1e735e870621cf19bcd4d57b53775c66e0b WatchSource:0}: Error finding container c6a89b139de20210d4b4374c34afb1e735e870621cf19bcd4d57b53775c66e0b: Status 404 returned error can't find the container with id c6a89b139de20210d4b4374c34afb1e735e870621cf19bcd4d57b53775c66e0b Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.721365 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.844362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util\") pod \"3311e11b-7e62-409e-95e9-88528c9bffbb\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.844546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkl94\" (UniqueName: \"kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94\") pod \"3311e11b-7e62-409e-95e9-88528c9bffbb\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.844572 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle\") pod \"3311e11b-7e62-409e-95e9-88528c9bffbb\" (UID: \"3311e11b-7e62-409e-95e9-88528c9bffbb\") " Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.845266 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle" (OuterVolumeSpecName: "bundle") pod "3311e11b-7e62-409e-95e9-88528c9bffbb" (UID: "3311e11b-7e62-409e-95e9-88528c9bffbb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.853426 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94" (OuterVolumeSpecName: "kube-api-access-xkl94") pod "3311e11b-7e62-409e-95e9-88528c9bffbb" (UID: "3311e11b-7e62-409e-95e9-88528c9bffbb"). InnerVolumeSpecName "kube-api-access-xkl94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.859943 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f5887564-r7xl7"] Sep 30 17:14:18 crc kubenswrapper[4772]: W0930 17:14:18.867248 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca46d68_5fa2_4d99_92da_89a7784f3325.slice/crio-4846fcec3ba8fb8756c2893bd23ced9109670678395a9c442dd6f5b86429c521 WatchSource:0}: Error finding container 4846fcec3ba8fb8756c2893bd23ced9109670678395a9c442dd6f5b86429c521: Status 404 returned error can't find the container with id 4846fcec3ba8fb8756c2893bd23ced9109670678395a9c442dd6f5b86429c521 Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.946514 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkl94\" (UniqueName: \"kubernetes.io/projected/3311e11b-7e62-409e-95e9-88528c9bffbb-kube-api-access-xkl94\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.946550 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:18 crc kubenswrapper[4772]: I0930 17:14:18.954411 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util" (OuterVolumeSpecName: "util") pod "3311e11b-7e62-409e-95e9-88528c9bffbb" (UID: "3311e11b-7e62-409e-95e9-88528c9bffbb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.047531 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3311e11b-7e62-409e-95e9-88528c9bffbb-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.452692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" event={"ID":"3311e11b-7e62-409e-95e9-88528c9bffbb","Type":"ContainerDied","Data":"b138c0558138a854d1c72cd921c3cd4d44f07405e42d5dc7008741016eb446fe"} Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.453068 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b138c0558138a854d1c72cd921c3cd4d44f07405e42d5dc7008741016eb446fe" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.452739 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.454304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" event={"ID":"1ca46d68-5fa2-4d99-92da-89a7784f3325","Type":"ContainerStarted","Data":"924a8682c9e938cec073c99f1e821416383b5ad16a15d4964ecead4fcf399150"} Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.454335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" event={"ID":"1ca46d68-5fa2-4d99-92da-89a7784f3325","Type":"ContainerStarted","Data":"4846fcec3ba8fb8756c2893bd23ced9109670678395a9c442dd6f5b86429c521"} Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.456492 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.458239 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" event={"ID":"c0a3f194-be9a-4431-9173-c73537a73108","Type":"ContainerStarted","Data":"1e7b91161c8249a988a25a36ec21df81830599e4d386d70c207958b528acb7ce"} Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.458266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" event={"ID":"c0a3f194-be9a-4431-9173-c73537a73108","Type":"ContainerStarted","Data":"c6a89b139de20210d4b4374c34afb1e735e870621cf19bcd4d57b53775c66e0b"} Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.458704 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.459636 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.462849 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.471698 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68f5887564-r7xl7" podStartSLOduration=3.471678706 podStartE2EDuration="3.471678706s" podCreationTimestamp="2025-09-30 17:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:14:19.471240345 +0000 UTC m=+760.378253186" watchObservedRunningTime="2025-09-30 17:14:19.471678706 +0000 UTC m=+760.378691537" Sep 30 17:14:19 crc kubenswrapper[4772]: I0930 17:14:19.756859 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8bd47f7c-bdxwl" podStartSLOduration=3.756843432 podStartE2EDuration="3.756843432s" podCreationTimestamp="2025-09-30 17:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:14:19.530170986 +0000 UTC m=+760.437183827" watchObservedRunningTime="2025-09-30 17:14:19.756843432 +0000 UTC m=+760.663856263" Sep 30 
17:14:23 crc kubenswrapper[4772]: I0930 17:14:23.835971 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.069512 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72"] Sep 30 17:14:24 crc kubenswrapper[4772]: E0930 17:14:24.069780 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="extract" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.069797 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="extract" Sep 30 17:14:24 crc kubenswrapper[4772]: E0930 17:14:24.069813 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="pull" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.069821 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="pull" Sep 30 17:14:24 crc kubenswrapper[4772]: E0930 17:14:24.069838 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="util" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.069847 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="util" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.069961 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3311e11b-7e62-409e-95e9-88528c9bffbb" containerName="extract" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.070464 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.072442 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8dclq" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.072638 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.073426 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.083306 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72"] Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.118816 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8psds\" (UniqueName: \"kubernetes.io/projected/67cdd39b-a0de-4d14-ba2f-2419b31983da-kube-api-access-8psds\") pod \"nmstate-operator-5d6f6cfd66-qds72\" (UID: \"67cdd39b-a0de-4d14-ba2f-2419b31983da\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.220626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8psds\" (UniqueName: \"kubernetes.io/projected/67cdd39b-a0de-4d14-ba2f-2419b31983da-kube-api-access-8psds\") pod \"nmstate-operator-5d6f6cfd66-qds72\" (UID: \"67cdd39b-a0de-4d14-ba2f-2419b31983da\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.242734 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8psds\" (UniqueName: \"kubernetes.io/projected/67cdd39b-a0de-4d14-ba2f-2419b31983da-kube-api-access-8psds\") pod \"nmstate-operator-5d6f6cfd66-qds72\" (UID: \"67cdd39b-a0de-4d14-ba2f-2419b31983da\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.391606 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" Sep 30 17:14:24 crc kubenswrapper[4772]: I0930 17:14:24.795971 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72"] Sep 30 17:14:25 crc kubenswrapper[4772]: I0930 17:14:25.494536 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" event={"ID":"67cdd39b-a0de-4d14-ba2f-2419b31983da","Type":"ContainerStarted","Data":"cc613dc8d0d8a18dbea32a5cc4a92171026f3d4aca55bed29d4b6415c009a07c"} Sep 30 17:14:27 crc kubenswrapper[4772]: I0930 17:14:27.510899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" event={"ID":"67cdd39b-a0de-4d14-ba2f-2419b31983da","Type":"ContainerStarted","Data":"03c040e3901d43ad28ddd5dd837fbb48a916a75bc9df7da437eee098608fe35e"} Sep 30 17:14:27 crc kubenswrapper[4772]: I0930 17:14:27.528568 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qds72" podStartSLOduration=1.5680740260000001 podStartE2EDuration="3.52855291s" podCreationTimestamp="2025-09-30 17:14:24 +0000 UTC" firstStartedPulling="2025-09-30 17:14:24.808854908 +0000 UTC m=+765.715867739" lastFinishedPulling="2025-09-30 17:14:26.769333792 +0000 UTC m=+767.676346623" observedRunningTime="2025-09-30 17:14:27.527726568 +0000 UTC m=+768.434739419" watchObservedRunningTime="2025-09-30 17:14:27.52855291 +0000 UTC m=+768.435565741" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.573520 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.575376 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.582577 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.742689 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sckg4\" (UniqueName: \"kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.742753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.742781 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.843815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sckg4\" (UniqueName: \"kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.844178 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.844274 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.844819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.844895 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.868025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sckg4\" (UniqueName: \"kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4\") pod \"redhat-operators-wrxvs\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:32 crc kubenswrapper[4772]: I0930 17:14:32.894910 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.305073 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.547680 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerStarted","Data":"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14"} Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.548044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerStarted","Data":"d3b7ab78e71dbf8fed88b2a9e0cd553cfbd0e61952d21128250e72676d9a3e7e"} Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.726455 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-sll7d"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.727583 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.733794 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jjfhk" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.739002 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-kphv7"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.739909 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.742384 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.764758 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-kphv7"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.775808 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pf6qn"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.776516 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.800164 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-sll7d"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.862027 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.862107 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whv4d\" (UniqueName: \"kubernetes.io/projected/4febaade-1298-413f-8f68-ca4771613783-kube-api-access-whv4d\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.862199 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b678w\" (UniqueName: \"kubernetes.io/projected/aed9b88e-7f1b-472d-a22c-ebf719c71f73-kube-api-access-b678w\") pod \"nmstate-metrics-58fcddf996-sll7d\" (UID: \"aed9b88e-7f1b-472d-a22c-ebf719c71f73\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.891198 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.891938 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.894463 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.895521 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.901362 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6n4dp" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.906286 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg"] Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.963361 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-nmstate-lock\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.963427 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6hd\" (UniqueName: \"kubernetes.io/projected/477c7640-d169-487f-a2d7-9164f8b26417-kube-api-access-qw6hd\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.963967 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b678w\" (UniqueName: \"kubernetes.io/projected/aed9b88e-7f1b-472d-a22c-ebf719c71f73-kube-api-access-b678w\") pod \"nmstate-metrics-58fcddf996-sll7d\" (UID: \"aed9b88e-7f1b-472d-a22c-ebf719c71f73\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.964077 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-ovs-socket\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.964146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.964168 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whv4d\" (UniqueName: \"kubernetes.io/projected/4febaade-1298-413f-8f68-ca4771613783-kube-api-access-whv4d\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.964224 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-dbus-socket\") pod \"nmstate-handler-pf6qn\" (UID: 
\"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:33 crc kubenswrapper[4772]: E0930 17:14:33.964326 4772 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 17:14:33 crc kubenswrapper[4772]: E0930 17:14:33.964419 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair podName:4febaade-1298-413f-8f68-ca4771613783 nodeName:}" failed. No retries permitted until 2025-09-30 17:14:34.464392345 +0000 UTC m=+775.371405286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair") pod "nmstate-webhook-6d689559c5-kphv7" (UID: "4febaade-1298-413f-8f68-ca4771613783") : secret "openshift-nmstate-webhook" not found Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.993283 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whv4d\" (UniqueName: \"kubernetes.io/projected/4febaade-1298-413f-8f68-ca4771613783-kube-api-access-whv4d\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:33 crc kubenswrapper[4772]: I0930 17:14:33.995387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b678w\" (UniqueName: \"kubernetes.io/projected/aed9b88e-7f1b-472d-a22c-ebf719c71f73-kube-api-access-b678w\") pod \"nmstate-metrics-58fcddf996-sll7d\" (UID: \"aed9b88e-7f1b-472d-a22c-ebf719c71f73\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.041674 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065128 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-ovs-socket\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065288 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgqs\" (UniqueName: \"kubernetes.io/projected/f076e40b-6b99-4a23-8235-c008e4a209c5-kube-api-access-kfgqs\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065372 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-ovs-socket\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-dbus-socket\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065596 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-nmstate-lock\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw6hd\" (UniqueName: \"kubernetes.io/projected/477c7640-d169-487f-a2d7-9164f8b26417-kube-api-access-qw6hd\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065741 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f076e40b-6b99-4a23-8235-c008e4a209c5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065814 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f076e40b-6b99-4a23-8235-c008e4a209c5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065773 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-nmstate-lock\") pod 
\"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.065744 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/477c7640-d169-487f-a2d7-9164f8b26417-dbus-socket\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.098288 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw6hd\" (UniqueName: \"kubernetes.io/projected/477c7640-d169-487f-a2d7-9164f8b26417-kube-api-access-qw6hd\") pod \"nmstate-handler-pf6qn\" (UID: \"477c7640-d169-487f-a2d7-9164f8b26417\") " pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.099557 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:34 crc kubenswrapper[4772]: W0930 17:14:34.136231 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477c7640_d169_487f_a2d7_9164f8b26417.slice/crio-0a99d3dee1384acf902e136e5b40855a84ec9581c708268f4254a1cf2d007e0e WatchSource:0}: Error finding container 0a99d3dee1384acf902e136e5b40855a84ec9581c708268f4254a1cf2d007e0e: Status 404 returned error can't find the container with id 0a99d3dee1384acf902e136e5b40855a84ec9581c708268f4254a1cf2d007e0e Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.138610 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5767d7b4df-5ddh2"] Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.139370 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.163430 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5767d7b4df-5ddh2"] Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.169957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-service-ca\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171291 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfgqs\" (UniqueName: \"kubernetes.io/projected/f076e40b-6b99-4a23-8235-c008e4a209c5-kube-api-access-kfgqs\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-trusted-ca-bundle\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171508 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n878n\" (UniqueName: \"kubernetes.io/projected/ab416851-500d-430c-80ee-4b06181c03e0-kube-api-access-n878n\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-oauth-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171671 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-console-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171765 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-oauth-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171842 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f076e40b-6b99-4a23-8235-c008e4a209c5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f076e40b-6b99-4a23-8235-c008e4a209c5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.171980 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.173470 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f076e40b-6b99-4a23-8235-c008e4a209c5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.181465 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f076e40b-6b99-4a23-8235-c008e4a209c5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.192647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgqs\" (UniqueName: \"kubernetes.io/projected/f076e40b-6b99-4a23-8235-c008e4a209c5-kube-api-access-kfgqs\") pod \"nmstate-console-plugin-864bb6dfb5-m6pgg\" (UID: \"f076e40b-6b99-4a23-8235-c008e4a209c5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.221126 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.276770 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-trusted-ca-bundle\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.276843 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n878n\" (UniqueName: \"kubernetes.io/projected/ab416851-500d-430c-80ee-4b06181c03e0-kube-api-access-n878n\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.276874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-oauth-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.276921 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-console-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.276973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-oauth-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.277029 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.277084 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-service-ca\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.278239 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-service-ca\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.278658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-oauth-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: 
\"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.278764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-trusted-ca-bundle\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.283774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab416851-500d-430c-80ee-4b06181c03e0-console-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.286982 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-oauth-config\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.294289 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab416851-500d-430c-80ee-4b06181c03e0-console-serving-cert\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.297580 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n878n\" (UniqueName: \"kubernetes.io/projected/ab416851-500d-430c-80ee-4b06181c03e0-kube-api-access-n878n\") pod \"console-5767d7b4df-5ddh2\" (UID: \"ab416851-500d-430c-80ee-4b06181c03e0\") " pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.479718 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.485337 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4febaade-1298-413f-8f68-ca4771613783-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-kphv7\" (UID: \"4febaade-1298-413f-8f68-ca4771613783\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.538114 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.557909 4772 generic.go:334] "Generic (PLEG): container finished" podID="f288de1a-c111-4df2-adfc-a2725320ada8" containerID="dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14" exitCode=0 Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.558030 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerDied","Data":"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14"} Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.561010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pf6qn" event={"ID":"477c7640-d169-487f-a2d7-9164f8b26417","Type":"ContainerStarted","Data":"0a99d3dee1384acf902e136e5b40855a84ec9581c708268f4254a1cf2d007e0e"} Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.585302 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-sll7d"] Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.673620 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.680954 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg"] Sep 30 17:14:34 crc kubenswrapper[4772]: W0930 17:14:34.702079 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf076e40b_6b99_4a23_8235_c008e4a209c5.slice/crio-04f5376f247650fe13bca7db9ea18aa013ccb0aa6a689e45bd4b141f87bfccae WatchSource:0}: Error finding container 04f5376f247650fe13bca7db9ea18aa013ccb0aa6a689e45bd4b141f87bfccae: Status 404 returned error can't find the container with id 04f5376f247650fe13bca7db9ea18aa013ccb0aa6a689e45bd4b141f87bfccae Sep 30 17:14:34 crc kubenswrapper[4772]: I0930 17:14:34.940801 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5767d7b4df-5ddh2"] Sep 30 17:14:34 crc kubenswrapper[4772]: W0930 17:14:34.944104 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab416851_500d_430c_80ee_4b06181c03e0.slice/crio-043e9914cd4a2681a710b2d21b585da72db1d4bf5bf7a0260a59565eb1530fad WatchSource:0}: Error finding container 043e9914cd4a2681a710b2d21b585da72db1d4bf5bf7a0260a59565eb1530fad: Status 404 returned error can't find the container with id 043e9914cd4a2681a710b2d21b585da72db1d4bf5bf7a0260a59565eb1530fad Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.073249 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-kphv7"] Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.567619 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" event={"ID":"f076e40b-6b99-4a23-8235-c008e4a209c5","Type":"ContainerStarted","Data":"04f5376f247650fe13bca7db9ea18aa013ccb0aa6a689e45bd4b141f87bfccae"} Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.569080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" 
event={"ID":"aed9b88e-7f1b-472d-a22c-ebf719c71f73","Type":"ContainerStarted","Data":"4136481c9ef718ca9890022ff2de95e0f6f802a2493ec884c0bcd396f8b85df3"} Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.570606 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5767d7b4df-5ddh2" event={"ID":"ab416851-500d-430c-80ee-4b06181c03e0","Type":"ContainerStarted","Data":"ebc506029dee7d9b1571b212f9a9dcc7138d369f1f3877a5b4b5882ad089a8e2"} Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.570657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5767d7b4df-5ddh2" event={"ID":"ab416851-500d-430c-80ee-4b06181c03e0","Type":"ContainerStarted","Data":"043e9914cd4a2681a710b2d21b585da72db1d4bf5bf7a0260a59565eb1530fad"} Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.571634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" event={"ID":"4febaade-1298-413f-8f68-ca4771613783","Type":"ContainerStarted","Data":"44914cffdd73418ca94c68137fcc43fefbe71fab4ebba6ed8f7f446eae8e4627"} Sep 30 17:14:35 crc kubenswrapper[4772]: I0930 17:14:35.593362 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5767d7b4df-5ddh2" podStartSLOduration=1.59334271 podStartE2EDuration="1.59334271s" podCreationTimestamp="2025-09-30 17:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:14:35.589693585 +0000 UTC m=+776.496706416" watchObservedRunningTime="2025-09-30 17:14:35.59334271 +0000 UTC m=+776.500355541" Sep 30 17:14:36 crc kubenswrapper[4772]: I0930 17:14:36.583566 4772 generic.go:334] "Generic (PLEG): container finished" podID="f288de1a-c111-4df2-adfc-a2725320ada8" containerID="f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb" exitCode=0 Sep 30 17:14:36 crc kubenswrapper[4772]: I0930 17:14:36.583690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerDied","Data":"f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.600027 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pf6qn" event={"ID":"477c7640-d169-487f-a2d7-9164f8b26417","Type":"ContainerStarted","Data":"88492ae6189710aeec70b88d874b50529e042c29093680bb4cec71c0fcf67c91"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.600656 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.601680 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" event={"ID":"f076e40b-6b99-4a23-8235-c008e4a209c5","Type":"ContainerStarted","Data":"aa913711ccd67897fc06c4d7b17c7bdd1a1ba059f75684219d3c79c74ff25591"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.603205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" event={"ID":"aed9b88e-7f1b-472d-a22c-ebf719c71f73","Type":"ContainerStarted","Data":"6e66c498c9e2325e5534271f8f8766e910b2a983caac01055f230eb5624ee693"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.604500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" event={"ID":"4febaade-1298-413f-8f68-ca4771613783","Type":"ContainerStarted","Data":"b1523b0d170026b1f39e6aa044ef9fdc34380b0f0b7c9029c5b419b74724d6cc"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.604610 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.607863 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerStarted","Data":"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf"} Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.625828 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pf6qn" podStartSLOduration=2.28258583 podStartE2EDuration="5.625800473s" podCreationTimestamp="2025-09-30 17:14:33 +0000 UTC" firstStartedPulling="2025-09-30 17:14:34.141625038 +0000 UTC m=+775.048637869" lastFinishedPulling="2025-09-30 17:14:37.484839661 +0000 UTC m=+778.391852512" observedRunningTime="2025-09-30 17:14:38.621467919 +0000 UTC m=+779.528480770" watchObservedRunningTime="2025-09-30 17:14:38.625800473 +0000 UTC m=+779.532813324" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.647916 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wrxvs" podStartSLOduration=3.705902096 podStartE2EDuration="6.64789177s" podCreationTimestamp="2025-09-30 17:14:32 +0000 UTC" firstStartedPulling="2025-09-30 17:14:34.560369935 +0000 UTC m=+775.467382786" lastFinishedPulling="2025-09-30 17:14:37.502359629 +0000 UTC m=+778.409372460" observedRunningTime="2025-09-30 17:14:38.642886589 +0000 UTC m=+779.549899440" watchObservedRunningTime="2025-09-30 17:14:38.64789177 +0000 UTC m=+779.554904611" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.656108 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.656181 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.662550 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" podStartSLOduration=3.21630209 podStartE2EDuration="5.662527833s" podCreationTimestamp="2025-09-30 17:14:33 +0000 UTC" firstStartedPulling="2025-09-30 17:14:35.081622932 +0000 UTC m=+775.988635763" lastFinishedPulling="2025-09-30 17:14:37.527848665 +0000 UTC m=+778.434861506" observedRunningTime="2025-09-30 17:14:38.661812364 +0000 UTC m=+779.568825215" watchObservedRunningTime="2025-09-30 17:14:38.662527833 +0000 UTC m=+779.569540664" Sep 30 17:14:38 crc kubenswrapper[4772]: I0930 17:14:38.704690 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-m6pgg" 
podStartSLOduration=2.911370818 podStartE2EDuration="5.704673715s" podCreationTimestamp="2025-09-30 17:14:33 +0000 UTC" firstStartedPulling="2025-09-30 17:14:34.706568647 +0000 UTC m=+775.613581478" lastFinishedPulling="2025-09-30 17:14:37.499871534 +0000 UTC m=+778.406884375" observedRunningTime="2025-09-30 17:14:38.703408082 +0000 UTC m=+779.610420933" watchObservedRunningTime="2025-09-30 17:14:38.704673715 +0000 UTC m=+779.611686546" Sep 30 17:14:40 crc kubenswrapper[4772]: I0930 17:14:40.631153 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" event={"ID":"aed9b88e-7f1b-472d-a22c-ebf719c71f73","Type":"ContainerStarted","Data":"eb675a04d0516b5b36ce33f5f85bf245ea7a228773bd57cbb823c5c8bdbce925"} Sep 30 17:14:42 crc kubenswrapper[4772]: I0930 17:14:42.895617 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:42 crc kubenswrapper[4772]: I0930 17:14:42.895973 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:42 crc kubenswrapper[4772]: I0930 17:14:42.933314 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:42 crc kubenswrapper[4772]: I0930 17:14:42.953412 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-sll7d" podStartSLOduration=4.8122650270000005 podStartE2EDuration="9.953393287s" podCreationTimestamp="2025-09-30 17:14:33 +0000 UTC" firstStartedPulling="2025-09-30 17:14:34.593391688 +0000 UTC m=+775.500404519" lastFinishedPulling="2025-09-30 17:14:39.734519948 +0000 UTC m=+780.641532779" observedRunningTime="2025-09-30 17:14:40.649733173 +0000 UTC m=+781.556746044" watchObservedRunningTime="2025-09-30 17:14:42.953393287 +0000 UTC m=+783.860406118" Sep 30 17:14:43 crc kubenswrapper[4772]: I0930 17:14:43.684758 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:43 crc kubenswrapper[4772]: I0930 17:14:43.762852 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.121780 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pf6qn" Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.539489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.539578 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.547602 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.656705 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5767d7b4df-5ddh2" Sep 30 17:14:44 crc kubenswrapper[4772]: I0930 17:14:44.709605 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"] Sep 30 17:14:45 crc kubenswrapper[4772]: I0930 17:14:45.662363 4772 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-wrxvs" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="registry-server" containerID="cri-o://e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf" gracePeriod=2 Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.237560 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.353854 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sckg4\" (UniqueName: \"kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4\") pod \"f288de1a-c111-4df2-adfc-a2725320ada8\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.354249 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities\") pod \"f288de1a-c111-4df2-adfc-a2725320ada8\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.354303 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content\") pod \"f288de1a-c111-4df2-adfc-a2725320ada8\" (UID: \"f288de1a-c111-4df2-adfc-a2725320ada8\") " Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.355218 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities" (OuterVolumeSpecName: "utilities") pod "f288de1a-c111-4df2-adfc-a2725320ada8" (UID: "f288de1a-c111-4df2-adfc-a2725320ada8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.363867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4" (OuterVolumeSpecName: "kube-api-access-sckg4") pod "f288de1a-c111-4df2-adfc-a2725320ada8" (UID: "f288de1a-c111-4df2-adfc-a2725320ada8"). InnerVolumeSpecName "kube-api-access-sckg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.455696 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.455741 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sckg4\" (UniqueName: \"kubernetes.io/projected/f288de1a-c111-4df2-adfc-a2725320ada8-kube-api-access-sckg4\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.678712 4772 generic.go:334] "Generic (PLEG): container finished" podID="f288de1a-c111-4df2-adfc-a2725320ada8" containerID="e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf" exitCode=0 Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.678786 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerDied","Data":"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf"} Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.678820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrxvs" event={"ID":"f288de1a-c111-4df2-adfc-a2725320ada8","Type":"ContainerDied","Data":"d3b7ab78e71dbf8fed88b2a9e0cd553cfbd0e61952d21128250e72676d9a3e7e"} Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.678842 4772 scope.go:117] "RemoveContainer" containerID="e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.678860 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrxvs" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.697754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f288de1a-c111-4df2-adfc-a2725320ada8" (UID: "f288de1a-c111-4df2-adfc-a2725320ada8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.707456 4772 scope.go:117] "RemoveContainer" containerID="f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.727987 4772 scope.go:117] "RemoveContainer" containerID="dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.756128 4772 scope.go:117] "RemoveContainer" containerID="e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf" Sep 30 17:14:46 crc kubenswrapper[4772]: E0930 17:14:46.756709 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf\": container with ID starting with e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf not found: ID does not exist" containerID="e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.756759 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf"} err="failed to get container status \"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf\": rpc error: code = NotFound desc = could not find container \"e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf\": container with ID starting with e6aff7f19eedbde5aaaa31677775334093dfc8f3da60b485e40f0e67061c2ccf not found: ID does not exist" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.756787 4772 scope.go:117] "RemoveContainer" containerID="f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb" Sep 30 17:14:46 crc kubenswrapper[4772]: E0930 17:14:46.757217 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb\": container with ID starting with f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb not found: ID does not exist" containerID="f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.757268 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb"} err="failed to get container status \"f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb\": rpc error: code = NotFound desc = could not find container \"f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb\": container with ID starting with f6370177b6b600b8e2db01bfa2e7ec264f1502840777d4c5a2b771d54233bffb not found: ID does not exist" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.757303 4772 scope.go:117] "RemoveContainer" containerID="dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14" Sep 30 17:14:46 crc kubenswrapper[4772]: E0930 17:14:46.757644 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14\": container with ID starting with dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14 not found: ID does not exist" containerID="dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14" Sep 30 17:14:46 crc 
kubenswrapper[4772]: I0930 17:14:46.757685 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14"} err="failed to get container status \"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14\": rpc error: code = NotFound desc = could not find container \"dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14\": container with ID starting with dcbfe8411038b82b45fd6730e2d66afbc8df38db7bf77231dd14ee513060aa14 not found: ID does not exist" Sep 30 17:14:46 crc kubenswrapper[4772]: I0930 17:14:46.763208 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f288de1a-c111-4df2-adfc-a2725320ada8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:14:47 crc kubenswrapper[4772]: I0930 17:14:47.023605 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:47 crc kubenswrapper[4772]: I0930 17:14:47.035941 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wrxvs"] Sep 30 17:14:47 crc kubenswrapper[4772]: I0930 17:14:47.906574 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" path="/var/lib/kubelet/pods/f288de1a-c111-4df2-adfc-a2725320ada8/volumes" Sep 30 17:14:54 crc kubenswrapper[4772]: I0930 17:14:54.680464 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-kphv7" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.136419 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5"] Sep 30 17:15:00 crc kubenswrapper[4772]: E0930 17:15:00.137419 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="extract-utilities" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.137437 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="extract-utilities" Sep 30 17:15:00 crc kubenswrapper[4772]: E0930 17:15:00.137460 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="registry-server" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.137468 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="registry-server" Sep 30 17:15:00 crc kubenswrapper[4772]: E0930 17:15:00.137481 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="extract-content" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.137491 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="extract-content" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.137629 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f288de1a-c111-4df2-adfc-a2725320ada8" containerName="registry-server" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.138154 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.141291 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.141594 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.147422 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5"] Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.334254 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbpvv\" (UniqueName: \"kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.334315 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.334357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.435421 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbpvv\" (UniqueName: \"kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.435463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.435553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.436741 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume\") pod 
\"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.443748 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.457806 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbpvv\" (UniqueName: \"kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv\") pod \"collect-profiles-29320875-np9f5\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.473860 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:00 crc kubenswrapper[4772]: I0930 17:15:00.897198 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5"] Sep 30 17:15:00 crc kubenswrapper[4772]: W0930 17:15:00.900044 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f91f03b_d4a4_4907_9b91_1f8098230413.slice/crio-50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2 WatchSource:0}: Error finding container 50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2: Status 404 returned error can't find the container with id 50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2 Sep 30 17:15:01 crc kubenswrapper[4772]: I0930 17:15:01.776679 4772 generic.go:334] "Generic (PLEG): container finished" podID="3f91f03b-d4a4-4907-9b91-1f8098230413" containerID="a905f9a2e34e13db6089bc782da3476e7115e86507f9acc4c8f663ddb440e72e" exitCode=0 Sep 30 17:15:01 crc kubenswrapper[4772]: I0930 17:15:01.776784 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" event={"ID":"3f91f03b-d4a4-4907-9b91-1f8098230413","Type":"ContainerDied","Data":"a905f9a2e34e13db6089bc782da3476e7115e86507f9acc4c8f663ddb440e72e"} Sep 30 17:15:01 crc kubenswrapper[4772]: I0930 17:15:01.777571 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" event={"ID":"3f91f03b-d4a4-4907-9b91-1f8098230413","Type":"ContainerStarted","Data":"50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2"} Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.099955 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.196921 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume\") pod \"3f91f03b-d4a4-4907-9b91-1f8098230413\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.197067 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbpvv\" (UniqueName: \"kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv\") pod \"3f91f03b-d4a4-4907-9b91-1f8098230413\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.197107 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume\") pod \"3f91f03b-d4a4-4907-9b91-1f8098230413\" (UID: \"3f91f03b-d4a4-4907-9b91-1f8098230413\") " Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.198121 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f91f03b-d4a4-4907-9b91-1f8098230413" (UID: "3f91f03b-d4a4-4907-9b91-1f8098230413"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.208363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f91f03b-d4a4-4907-9b91-1f8098230413" (UID: "3f91f03b-d4a4-4907-9b91-1f8098230413"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.209433 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv" (OuterVolumeSpecName: "kube-api-access-cbpvv") pod "3f91f03b-d4a4-4907-9b91-1f8098230413" (UID: "3f91f03b-d4a4-4907-9b91-1f8098230413"). InnerVolumeSpecName "kube-api-access-cbpvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.298990 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f91f03b-d4a4-4907-9b91-1f8098230413-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.299047 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbpvv\" (UniqueName: \"kubernetes.io/projected/3f91f03b-d4a4-4907-9b91-1f8098230413-kube-api-access-cbpvv\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.299077 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f91f03b-d4a4-4907-9b91-1f8098230413-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.801518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" event={"ID":"3f91f03b-d4a4-4907-9b91-1f8098230413","Type":"ContainerDied","Data":"50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2"} Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.801853 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f836e7de60b84d30e0e8a99467f48c30f29aff2c88f1ce38767a099bf55ab2" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.801568 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.858870 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:03 crc kubenswrapper[4772]: E0930 17:15:03.859337 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f91f03b-d4a4-4907-9b91-1f8098230413" containerName="collect-profiles" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.859358 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f91f03b-d4a4-4907-9b91-1f8098230413" containerName="collect-profiles" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.859534 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f91f03b-d4a4-4907-9b91-1f8098230413" containerName="collect-profiles" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.863842 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.866871 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.921547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz26s\" (UniqueName: \"kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.921751 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:03 crc kubenswrapper[4772]: I0930 17:15:03.921833 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.023756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.023872 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.023974 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz26s\" (UniqueName: \"kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.025675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.025713 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.049157 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xz26s\" (UniqueName: \"kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s\") pod \"community-operators-lqxzw\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.241748 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.561208 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.809780 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerID="5f3a50a575975e422e0fd6d17719a9646827a944fe86ed598728ee8dc2038b2e" exitCode=0 Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.809918 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerDied","Data":"5f3a50a575975e422e0fd6d17719a9646827a944fe86ed598728ee8dc2038b2e"} Sep 30 17:15:04 crc kubenswrapper[4772]: I0930 17:15:04.810289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerStarted","Data":"bca0e3a809b890ef173213951fe0c8236cb755bb98d6f1ab6fd2ca7dbdadf8bc"} Sep 30 17:15:06 crc kubenswrapper[4772]: I0930 17:15:06.834517 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerID="0064fedc2f420c4315f6053aeb1d577756920160470a144cc1e93d3e9fc854b3" exitCode=0 Sep 30 17:15:06 crc kubenswrapper[4772]: I0930 17:15:06.834592 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerDied","Data":"0064fedc2f420c4315f6053aeb1d577756920160470a144cc1e93d3e9fc854b3"} Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.656816 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.657321 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.657393 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.658268 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:15:08 
crc kubenswrapper[4772]: I0930 17:15:08.658360 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622" gracePeriod=600 Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.868785 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerStarted","Data":"0ea7b20b1e37cfa3e6489addd10088731854a1a3a6c6fedfc991234b43683e28"} Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.873190 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622" exitCode=0 Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.873227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622"} Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.873254 4772 scope.go:117] "RemoveContainer" containerID="8d300de23ff5fdda967fc356bca4e8a110fd4878bedac23ae19b92c618fe6c8a" Sep 30 17:15:08 crc kubenswrapper[4772]: I0930 17:15:08.897209 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lqxzw" podStartSLOduration=3.0517807 podStartE2EDuration="5.897190942s" podCreationTimestamp="2025-09-30 17:15:03 +0000 UTC" firstStartedPulling="2025-09-30 17:15:04.812450579 +0000 UTC m=+805.719463420" lastFinishedPulling="2025-09-30 17:15:07.657860831 +0000 UTC m=+808.564873662" observedRunningTime="2025-09-30 17:15:08.896314439 +0000 UTC m=+809.803327270" watchObservedRunningTime="2025-09-30 17:15:08.897190942 +0000 UTC m=+809.804203783" Sep 30 17:15:09 crc kubenswrapper[4772]: I0930 17:15:09.746743 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tm4sk" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerName="console" containerID="cri-o://50630cf5ccfe12326dbbb3c8ab68a449dceff0f0bf220a060582e638c809cb8b" gracePeriod=15 Sep 30 17:15:09 crc kubenswrapper[4772]: I0930 17:15:09.960432 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161"} Sep 30 17:15:09 crc kubenswrapper[4772]: I0930 17:15:09.986680 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tm4sk_d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7/console/0.log" Sep 30 17:15:09 crc kubenswrapper[4772]: I0930 17:15:09.986729 4772 generic.go:334] "Generic (PLEG): container finished" podID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerID="50630cf5ccfe12326dbbb3c8ab68a449dceff0f0bf220a060582e638c809cb8b" exitCode=2 Sep 30 17:15:09 crc kubenswrapper[4772]: I0930 17:15:09.987044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tm4sk" 
event={"ID":"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7","Type":"ContainerDied","Data":"50630cf5ccfe12326dbbb3c8ab68a449dceff0f0bf220a060582e638c809cb8b"} Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.285182 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tm4sk_d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7/console/0.log" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.285472 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tm4sk" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316676 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316750 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316827 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316865 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316912 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.316952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m9vl\" (UniqueName: \"kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl\") pod \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\" (UID: \"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7\") " Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.318077 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.318046 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.318142 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config" (OuterVolumeSpecName: "console-config") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.318162 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.322658 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"] Sep 30 17:15:10 crc kubenswrapper[4772]: E0930 17:15:10.323203 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerName="console" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.323242 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerName="console" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.323384 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" containerName="console" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.324448 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.327138 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"] Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.333436 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.336225 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl" (OuterVolumeSpecName: "kube-api-access-4m9vl") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "kube-api-access-4m9vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.342711 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.348007 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" (UID: "d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.419768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.419888 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.419915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjz6g\" (UniqueName: \"kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.419981 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.419997 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.420009 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.420020 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.420032 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.420043 4772 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-console-oauth-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.420078 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m9vl\" (UniqueName: \"kubernetes.io/projected/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7-kube-api-access-4m9vl\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.521610 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.521664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjz6g\" (UniqueName: \"kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.521732 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.522357 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.522446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.553620 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjz6g\" (UniqueName: \"kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.691105 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.993612 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tm4sk_d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7/console/0.log"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.994259 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tm4sk"
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.994254 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tm4sk" event={"ID":"d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7","Type":"ContainerDied","Data":"998485b1c9b72c3a49c26832d0c80244e39846f8113a4b353c4d6fcd69b0181c"}
Sep 30 17:15:10 crc kubenswrapper[4772]: I0930 17:15:10.994340 4772 scope.go:117] "RemoveContainer" containerID="50630cf5ccfe12326dbbb3c8ab68a449dceff0f0bf220a060582e638c809cb8b"
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.026325 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"]
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.032305 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tm4sk"]
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.153281 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"]
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.905732 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7" path="/var/lib/kubelet/pods/d6c5eaf7-d5ff-49e6-9b1a-a02ecd7f5ea7/volumes"
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.999553 4772 generic.go:334] "Generic (PLEG): container finished" podID="d4f52924-d141-4724-838f-d3bfd6dab358" containerID="1641ada822a072bbc031fe451b5ca0edf1b3295ac2372ad7c2628c85cc38e6f3" exitCode=0
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.999611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" event={"ID":"d4f52924-d141-4724-838f-d3bfd6dab358","Type":"ContainerDied","Data":"1641ada822a072bbc031fe451b5ca0edf1b3295ac2372ad7c2628c85cc38e6f3"}
Sep 30 17:15:11 crc kubenswrapper[4772]: I0930 17:15:11.999636 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" event={"ID":"d4f52924-d141-4724-838f-d3bfd6dab358","Type":"ContainerStarted","Data":"9faa47b9976cd8f25257ddea866b2f3a63d55bbe673bc3b518425dfc4d46a85d"}
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.013658 4772 generic.go:334] "Generic (PLEG): container finished" podID="d4f52924-d141-4724-838f-d3bfd6dab358" containerID="8addfd39809f75402e2847c2390c07bb7aba3101ebed6745ef130756f90ce38f" exitCode=0
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.013729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" event={"ID":"d4f52924-d141-4724-838f-d3bfd6dab358","Type":"ContainerDied","Data":"8addfd39809f75402e2847c2390c07bb7aba3101ebed6745ef130756f90ce38f"}
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.242182 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"]
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.244825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lqxzw"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.244856 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lqxzw"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.245283 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.260964 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"]
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.276987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.277039 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqn8\" (UniqueName: \"kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.277114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.305526 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lqxzw"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.378012 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.378051 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqn8\" (UniqueName: \"kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.378104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.378539 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.378572 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.398839 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqn8\" (UniqueName: \"kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8\") pod \"redhat-marketplace-f85gr\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") " pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:14 crc kubenswrapper[4772]: I0930 17:15:14.571190 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:15 crc kubenswrapper[4772]: I0930 17:15:15.023634 4772 generic.go:334] "Generic (PLEG): container finished" podID="d4f52924-d141-4724-838f-d3bfd6dab358" containerID="ab298a134d1a7e3aa0b0e8578e9d82b66400e797901364b20ede947544d526e4" exitCode=0
Sep 30 17:15:15 crc kubenswrapper[4772]: I0930 17:15:15.023730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" event={"ID":"d4f52924-d141-4724-838f-d3bfd6dab358","Type":"ContainerDied","Data":"ab298a134d1a7e3aa0b0e8578e9d82b66400e797901364b20ede947544d526e4"}
Sep 30 17:15:15 crc kubenswrapper[4772]: I0930 17:15:15.063264 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lqxzw"
Sep 30 17:15:15 crc kubenswrapper[4772]: I0930 17:15:15.128170 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"]
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.031094 4772 generic.go:334] "Generic (PLEG): container finished" podID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerID="d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce" exitCode=0
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.031191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerDied","Data":"d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce"}
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.031510 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerStarted","Data":"605e9f68938630614c04465a9660693ffdbcebef063eab42c3beea7c312fc0e4"}
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.278569 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.406187 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjz6g\" (UniqueName: \"kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g\") pod \"d4f52924-d141-4724-838f-d3bfd6dab358\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") "
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.406318 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle\") pod \"d4f52924-d141-4724-838f-d3bfd6dab358\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") "
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.406370 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-util\") pod \"d4f52924-d141-4724-838f-d3bfd6dab358\" (UID: \"d4f52924-d141-4724-838f-d3bfd6dab358\") "
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.407821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle" (OuterVolumeSpecName: "bundle") pod "d4f52924-d141-4724-838f-d3bfd6dab358" (UID: "d4f52924-d141-4724-838f-d3bfd6dab358"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.412889 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g" (OuterVolumeSpecName: "kube-api-access-jjz6g") pod "d4f52924-d141-4724-838f-d3bfd6dab358" (UID: "d4f52924-d141-4724-838f-d3bfd6dab358"). InnerVolumeSpecName "kube-api-access-jjz6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.507836 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjz6g\" (UniqueName: \"kubernetes.io/projected/d4f52924-d141-4724-838f-d3bfd6dab358-kube-api-access-jjz6g\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:16 crc kubenswrapper[4772]: I0930 17:15:16.507872 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.045584 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerStarted","Data":"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34"}
Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.050608 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w" event={"ID":"d4f52924-d141-4724-838f-d3bfd6dab358","Type":"ContainerDied","Data":"9faa47b9976cd8f25257ddea866b2f3a63d55bbe673bc3b518425dfc4d46a85d"}
Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.050658 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9faa47b9976cd8f25257ddea866b2f3a63d55bbe673bc3b518425dfc4d46a85d"
Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.050717 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w"
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.424739 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4f52924-d141-4724-838f-d3bfd6dab358-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.843603 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:17 crc kubenswrapper[4772]: I0930 17:15:17.843856 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lqxzw" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="registry-server" containerID="cri-o://0ea7b20b1e37cfa3e6489addd10088731854a1a3a6c6fedfc991234b43683e28" gracePeriod=2 Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.062166 4772 generic.go:334] "Generic (PLEG): container finished" podID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerID="74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34" exitCode=0 Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.062390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerDied","Data":"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34"} Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.065279 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerID="0ea7b20b1e37cfa3e6489addd10088731854a1a3a6c6fedfc991234b43683e28" exitCode=0 Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.065304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerDied","Data":"0ea7b20b1e37cfa3e6489addd10088731854a1a3a6c6fedfc991234b43683e28"} Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.243517 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.337377 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz26s\" (UniqueName: \"kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s\") pod \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.337545 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities\") pod \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.337598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content\") pod \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\" (UID: \"ea174d60-5a6c-4fbe-8edc-0b1b61f69549\") " Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.338388 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities" (OuterVolumeSpecName: "utilities") pod "ea174d60-5a6c-4fbe-8edc-0b1b61f69549" (UID: "ea174d60-5a6c-4fbe-8edc-0b1b61f69549"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.344236 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s" (OuterVolumeSpecName: "kube-api-access-xz26s") pod "ea174d60-5a6c-4fbe-8edc-0b1b61f69549" (UID: "ea174d60-5a6c-4fbe-8edc-0b1b61f69549"). InnerVolumeSpecName "kube-api-access-xz26s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.387509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea174d60-5a6c-4fbe-8edc-0b1b61f69549" (UID: "ea174d60-5a6c-4fbe-8edc-0b1b61f69549"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.439402 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.439466 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:18 crc kubenswrapper[4772]: I0930 17:15:18.439478 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz26s\" (UniqueName: \"kubernetes.io/projected/ea174d60-5a6c-4fbe-8edc-0b1b61f69549-kube-api-access-xz26s\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.074547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxzw" event={"ID":"ea174d60-5a6c-4fbe-8edc-0b1b61f69549","Type":"ContainerDied","Data":"bca0e3a809b890ef173213951fe0c8236cb755bb98d6f1ab6fd2ca7dbdadf8bc"} Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.074913 4772 scope.go:117] "RemoveContainer" containerID="0ea7b20b1e37cfa3e6489addd10088731854a1a3a6c6fedfc991234b43683e28" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.074611 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqxzw" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.077249 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerStarted","Data":"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c"} Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.092289 4772 scope.go:117] "RemoveContainer" containerID="0064fedc2f420c4315f6053aeb1d577756920160470a144cc1e93d3e9fc854b3" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.102301 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f85gr" podStartSLOduration=2.629512706 podStartE2EDuration="5.102282603s" podCreationTimestamp="2025-09-30 17:15:14 +0000 UTC" firstStartedPulling="2025-09-30 17:15:16.032994586 +0000 UTC m=+816.940007417" lastFinishedPulling="2025-09-30 17:15:18.505764483 +0000 UTC m=+819.412777314" observedRunningTime="2025-09-30 17:15:19.09986704 +0000 UTC m=+820.006879871" watchObservedRunningTime="2025-09-30 17:15:19.102282603 +0000 UTC m=+820.009295434" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.116799 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.120326 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lqxzw"] Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.127193 4772 scope.go:117] "RemoveContainer" containerID="5f3a50a575975e422e0fd6d17719a9646827a944fe86ed598728ee8dc2038b2e" Sep 30 17:15:19 crc kubenswrapper[4772]: I0930 17:15:19.908168 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" path="/var/lib/kubelet/pods/ea174d60-5a6c-4fbe-8edc-0b1b61f69549/volumes" Sep 30 17:15:24 crc kubenswrapper[4772]: I0930 17:15:24.572027 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f85gr" Sep 30 17:15:24 crc kubenswrapper[4772]: I0930 17:15:24.573576 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f85gr" Sep 30 17:15:24 crc kubenswrapper[4772]: I0930 17:15:24.610579 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f85gr" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.151408 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f85gr" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.836448 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"] Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837281 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="extract" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837390 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="extract" Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837456 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="extract-content" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837506 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="extract-content" Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837564 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="pull" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837614 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="pull" Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837664 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="extract-utilities" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837714 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="extract-utilities" Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837770 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="util" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837818 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="util" Sep 30 17:15:25 crc kubenswrapper[4772]: E0930 17:15:25.837869 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="registry-server" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.837942 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="registry-server" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.838143 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea174d60-5a6c-4fbe-8edc-0b1b61f69549" containerName="registry-server" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.838205 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d4f52924-d141-4724-838f-d3bfd6dab358" containerName="extract" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.838679 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.841636 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.841665 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.842081 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.842306 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-whn6f" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.842350 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.870322 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"] Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.871516 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.877674 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"] Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.933577 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"] Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969253 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlkm\" (UniqueName: \"kubernetes.io/projected/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-kube-api-access-hjlkm\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969503 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ckz\" (UniqueName: \"kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz\") pod \"certified-operators-tpp8r\" (UID: 
\"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-webhook-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:25 crc kubenswrapper[4772]: I0930 17:15:25.969663 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-apiservice-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071350 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071441 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ckz\" (UniqueName: \"kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-webhook-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071494 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-apiservice-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlkm\" (UniqueName: \"kubernetes.io/projected/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-kube-api-access-hjlkm\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:26 crc 
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.071999 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.079820 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-webhook-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.081526 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-apiservice-cert\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.090585 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlkm\" (UniqueName: \"kubernetes.io/projected/d8b0a4f0-a6d9-46ff-9487-98fec1d43e07-kube-api-access-hjlkm\") pod \"metallb-operator-controller-manager-cbbfcbbd-w9mxx\" (UID: \"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07\") " pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.093662 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ckz\" (UniqueName: \"kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz\") pod \"certified-operators-tpp8r\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " pod="openshift-marketplace/certified-operators-tpp8r"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.158021 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.181788 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"]
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.182784 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.185981 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tpp8r"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.194075 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.194290 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.194399 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cqvz5"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.210163 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"]
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.274655 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.274735 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-webhook-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.274815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxnw9\" (UniqueName: \"kubernetes.io/projected/fe6335cc-f638-4411-85e6-bf6beea1f24f-kube-api-access-xxnw9\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.379965 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.380026 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-webhook-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.380092 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxnw9\" (UniqueName: \"kubernetes.io/projected/fe6335cc-f638-4411-85e6-bf6beea1f24f-kube-api-access-xxnw9\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.393881 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-webhook-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.394089 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe6335cc-f638-4411-85e6-bf6beea1f24f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.410882 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxnw9\" (UniqueName: \"kubernetes.io/projected/fe6335cc-f638-4411-85e6-bf6beea1f24f-kube-api-access-xxnw9\") pod \"metallb-operator-webhook-server-7f67d8696d-jl7tp\" (UID: \"fe6335cc-f638-4411-85e6-bf6beea1f24f\") " pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.505524 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx"]
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.563897 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"
Sep 30 17:15:26 crc kubenswrapper[4772]: I0930 17:15:26.796710 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"]
Sep 30 17:15:27 crc kubenswrapper[4772]: I0930 17:15:27.125896 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" event={"ID":"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07","Type":"ContainerStarted","Data":"26844fb93a8c851f329834244e88f3f73b25013b855442b088c73babbb8fce2b"}
Sep 30 17:15:27 crc kubenswrapper[4772]: I0930 17:15:27.128442 4772 generic.go:334] "Generic (PLEG): container finished" podID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerID="b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a" exitCode=0
Sep 30 17:15:27 crc kubenswrapper[4772]: I0930 17:15:27.128536 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerDied","Data":"b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a"}
Sep 30 17:15:27 crc kubenswrapper[4772]: I0930 17:15:27.128592 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerStarted","Data":"3ece090dffd7748c98ee833f44afcae1bba94c496a8029213590906bc9c34850"}
Sep 30 17:15:27 crc kubenswrapper[4772]: I0930 17:15:27.197168 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp"]
Sep 30 17:15:27 crc kubenswrapper[4772]: W0930 17:15:27.203292 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6335cc_f638_4411_85e6_bf6beea1f24f.slice/crio-e73108f980e0fef71ddef5ac3e4c23a208da94eda236173684df48c1b0dbf617 WatchSource:0}: Error finding container e73108f980e0fef71ddef5ac3e4c23a208da94eda236173684df48c1b0dbf617: Status 404 returned error can't find the container with id e73108f980e0fef71ddef5ac3e4c23a208da94eda236173684df48c1b0dbf617
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.141266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerStarted","Data":"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b"}
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.156281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp" event={"ID":"fe6335cc-f638-4411-85e6-bf6beea1f24f","Type":"ContainerStarted","Data":"e73108f980e0fef71ddef5ac3e4c23a208da94eda236173684df48c1b0dbf617"}
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.231364 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"]
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.231564 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f85gr" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="registry-server" containerID="cri-o://ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c" gracePeriod=2
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.807423 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f85gr"
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.922843 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjqn8\" (UniqueName: \"kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8\") pod \"8940bf5d-2be4-4de3-88b0-26455e2338b4\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") "
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.923009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content\") pod \"8940bf5d-2be4-4de3-88b0-26455e2338b4\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") "
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.923122 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities\") pod \"8940bf5d-2be4-4de3-88b0-26455e2338b4\" (UID: \"8940bf5d-2be4-4de3-88b0-26455e2338b4\") "
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.924761 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities" (OuterVolumeSpecName: "utilities") pod "8940bf5d-2be4-4de3-88b0-26455e2338b4" (UID: "8940bf5d-2be4-4de3-88b0-26455e2338b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.932191 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8" (OuterVolumeSpecName: "kube-api-access-zjqn8") pod "8940bf5d-2be4-4de3-88b0-26455e2338b4" (UID: "8940bf5d-2be4-4de3-88b0-26455e2338b4"). InnerVolumeSpecName "kube-api-access-zjqn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:15:28 crc kubenswrapper[4772]: I0930 17:15:28.941322 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8940bf5d-2be4-4de3-88b0-26455e2338b4" (UID: "8940bf5d-2be4-4de3-88b0-26455e2338b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.026856 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.026890 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjqn8\" (UniqueName: \"kubernetes.io/projected/8940bf5d-2be4-4de3-88b0-26455e2338b4-kube-api-access-zjqn8\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.026903 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8940bf5d-2be4-4de3-88b0-26455e2338b4-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.172745 4772 generic.go:334] "Generic (PLEG): container finished" podID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerID="b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b" exitCode=0
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.172803 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerDied","Data":"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b"}
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.179544 4772 generic.go:334] "Generic (PLEG): container finished" podID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerID="ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c" exitCode=0
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.179586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerDied","Data":"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c"}
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.179666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f85gr" event={"ID":"8940bf5d-2be4-4de3-88b0-26455e2338b4","Type":"ContainerDied","Data":"605e9f68938630614c04465a9660693ffdbcebef063eab42c3beea7c312fc0e4"}
Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.179722 4772 scope.go:117] "RemoveContainer" containerID="ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f85gr" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.213482 4772 scope.go:117] "RemoveContainer" containerID="74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.230517 4772 scope.go:117] "RemoveContainer" containerID="d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.234813 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"] Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.239349 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f85gr"] Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.247618 4772 scope.go:117] "RemoveContainer" containerID="ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c" Sep 30 17:15:29 crc kubenswrapper[4772]: E0930 17:15:29.249115 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c\": container with ID starting with ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c not found: ID does not exist" containerID="ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.249152 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c"} err="failed to get container status \"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c\": rpc error: code = NotFound desc = could not find container \"ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c\": container with ID starting with ca1ebf9b89913a482e5185aa8fcf006968293eb9e4b83e1ef7ee760b64a2b40c not found: ID does not exist" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.249176 4772 scope.go:117] "RemoveContainer" containerID="74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34" Sep 30 17:15:29 crc kubenswrapper[4772]: E0930 17:15:29.249490 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34\": container with ID starting with 74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34 not found: ID does not exist" containerID="74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.249525 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34"} err="failed to get container status \"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34\": rpc error: code = NotFound desc = could not find container \"74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34\": container with ID starting with 74b5d32d77218b84989d688b0b2904360479c1f4a238a9b688101ef9845cdd34 not found: ID does not exist" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.249538 4772 scope.go:117] "RemoveContainer" containerID="d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce" Sep 30 17:15:29 crc kubenswrapper[4772]: E0930 17:15:29.249801 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce\": container with ID starting with d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce not found: ID does not exist" containerID="d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.249823 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce"} err="failed to get container status \"d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce\": rpc error: code = NotFound desc = could not find container \"d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce\": container with ID starting with d435b626e30a3b69d2ad9f7209974d8d55fc79fb0b6136eab937c2ede51ca6ce not found: ID does not exist" Sep 30 17:15:29 crc kubenswrapper[4772]: I0930 17:15:29.907334 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" path="/var/lib/kubelet/pods/8940bf5d-2be4-4de3-88b0-26455e2338b4/volumes" Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.205348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" event={"ID":"d8b0a4f0-a6d9-46ff-9487-98fec1d43e07","Type":"ContainerStarted","Data":"c6ad8f12feb66c773ba027833906287f3c48c2bb17ae7c2425c3fed7084c8671"} Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.206166 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.208664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerStarted","Data":"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3"} Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.210713 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp" event={"ID":"fe6335cc-f638-4411-85e6-bf6beea1f24f","Type":"ContainerStarted","Data":"d0df688ed613481bb7c1383a9460e25a00a27e8b4348940e4c0a759d0cbf0ed2"} Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.210860 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp" Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.232814 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" podStartSLOduration=1.883555528 podStartE2EDuration="7.232786389s" podCreationTimestamp="2025-09-30 17:15:25 +0000 UTC" firstStartedPulling="2025-09-30 17:15:26.533224456 +0000 UTC m=+827.440237287" lastFinishedPulling="2025-09-30 17:15:31.882455317 +0000 UTC m=+832.789468148" observedRunningTime="2025-09-30 17:15:32.228167518 +0000 UTC m=+833.135180369" watchObservedRunningTime="2025-09-30 17:15:32.232786389 +0000 UTC m=+833.139799230" Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.249617 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tpp8r" podStartSLOduration=2.497952836 podStartE2EDuration="7.249597539s" podCreationTimestamp="2025-09-30 17:15:25 
+0000 UTC" firstStartedPulling="2025-09-30 17:15:27.129984492 +0000 UTC m=+828.036997323" lastFinishedPulling="2025-09-30 17:15:31.881629195 +0000 UTC m=+832.788642026" observedRunningTime="2025-09-30 17:15:32.245678096 +0000 UTC m=+833.152690917" watchObservedRunningTime="2025-09-30 17:15:32.249597539 +0000 UTC m=+833.156610370" Sep 30 17:15:32 crc kubenswrapper[4772]: I0930 17:15:32.269004 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp" podStartSLOduration=1.490136031 podStartE2EDuration="6.268987986s" podCreationTimestamp="2025-09-30 17:15:26 +0000 UTC" firstStartedPulling="2025-09-30 17:15:27.204926502 +0000 UTC m=+828.111939333" lastFinishedPulling="2025-09-30 17:15:31.983778457 +0000 UTC m=+832.890791288" observedRunningTime="2025-09-30 17:15:32.263230965 +0000 UTC m=+833.170243796" watchObservedRunningTime="2025-09-30 17:15:32.268987986 +0000 UTC m=+833.176000817" Sep 30 17:15:36 crc kubenswrapper[4772]: I0930 17:15:36.186629 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:36 crc kubenswrapper[4772]: I0930 17:15:36.186947 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:36 crc kubenswrapper[4772]: I0930 17:15:36.225798 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:46 crc kubenswrapper[4772]: I0930 17:15:46.236775 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:46 crc kubenswrapper[4772]: I0930 17:15:46.574734 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f67d8696d-jl7tp" Sep 30 17:15:48 crc kubenswrapper[4772]: I0930 17:15:48.633166 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"] Sep 30 17:15:48 crc kubenswrapper[4772]: I0930 17:15:48.633414 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tpp8r" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="registry-server" containerID="cri-o://6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3" gracePeriod=2 Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.146712 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.223011 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4ckz\" (UniqueName: \"kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz\") pod \"133d5327-eb9f-4a07-bf40-3980530b2e14\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.223132 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content\") pod \"133d5327-eb9f-4a07-bf40-3980530b2e14\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.223188 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities\") pod \"133d5327-eb9f-4a07-bf40-3980530b2e14\" (UID: \"133d5327-eb9f-4a07-bf40-3980530b2e14\") " Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.224109 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities" (OuterVolumeSpecName: "utilities") pod "133d5327-eb9f-4a07-bf40-3980530b2e14" (UID: "133d5327-eb9f-4a07-bf40-3980530b2e14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.231465 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz" (OuterVolumeSpecName: "kube-api-access-q4ckz") pod "133d5327-eb9f-4a07-bf40-3980530b2e14" (UID: "133d5327-eb9f-4a07-bf40-3980530b2e14"). InnerVolumeSpecName "kube-api-access-q4ckz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.262477 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "133d5327-eb9f-4a07-bf40-3980530b2e14" (UID: "133d5327-eb9f-4a07-bf40-3980530b2e14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.321530 4772 generic.go:334] "Generic (PLEG): container finished" podID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerID="6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3" exitCode=0 Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.321590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerDied","Data":"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3"} Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.321629 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpp8r" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.321651 4772 scope.go:117] "RemoveContainer" containerID="6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.321635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpp8r" event={"ID":"133d5327-eb9f-4a07-bf40-3980530b2e14","Type":"ContainerDied","Data":"3ece090dffd7748c98ee833f44afcae1bba94c496a8029213590906bc9c34850"} Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.325358 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.325393 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133d5327-eb9f-4a07-bf40-3980530b2e14-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.325413 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4ckz\" (UniqueName: \"kubernetes.io/projected/133d5327-eb9f-4a07-bf40-3980530b2e14-kube-api-access-q4ckz\") on node \"crc\" DevicePath \"\"" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.338905 4772 scope.go:117] "RemoveContainer" containerID="b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.358841 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"] Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.364430 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tpp8r"] Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.390232 4772 scope.go:117] "RemoveContainer" containerID="b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.407424 4772 scope.go:117] "RemoveContainer" containerID="6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3" Sep 30 17:15:49 crc kubenswrapper[4772]: E0930 17:15:49.407847 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3\": container with ID starting with 6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3 not found: ID does not exist" containerID="6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.407880 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3"} err="failed to get container status \"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3\": rpc error: code = NotFound desc = could not find container \"6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3\": container with ID starting with 6330da3ecaa2d334842f37fa68eeb27649429875337f42eefd903b0229d518d3 not found: ID does not exist" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.407901 4772 scope.go:117] "RemoveContainer" containerID="b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b" Sep 30 17:15:49 crc kubenswrapper[4772]: 
E0930 17:15:49.408409 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b\": container with ID starting with b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b not found: ID does not exist" containerID="b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.408432 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b"} err="failed to get container status \"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b\": rpc error: code = NotFound desc = could not find container \"b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b\": container with ID starting with b337b9862f81200183a0e5fcb06b5b13bc092dabb325566f429dd6c44d411f3b not found: ID does not exist" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.408445 4772 scope.go:117] "RemoveContainer" containerID="b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a" Sep 30 17:15:49 crc kubenswrapper[4772]: E0930 17:15:49.408702 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a\": container with ID starting with b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a not found: ID does not exist" containerID="b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.408723 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a"} err="failed to get container status \"b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a\": rpc error: code = NotFound desc = could not find container \"b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a\": container with ID starting with b4e45d90761a26d2ad0a8bc8c5be1fcebe57fa5f2c25b0c93005c7962679e13a not found: ID does not exist" Sep 30 17:15:49 crc kubenswrapper[4772]: I0930 17:15:49.906610 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" path="/var/lib/kubelet/pods/133d5327-eb9f-4a07-bf40-3980530b2e14/volumes" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.161152 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cbbfcbbd-w9mxx" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934417 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-f68tb"] Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.934818 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934845 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.934869 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="extract-content" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934882 4772 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="extract-content" Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.934895 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="extract-content" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934907 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="extract-content" Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.934939 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934951 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.934971 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="extract-utilities" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.934983 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="extract-utilities" Sep 30 17:16:06 crc kubenswrapper[4772]: E0930 17:16:06.935024 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="extract-utilities" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.935038 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="extract-utilities" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.935337 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="133d5327-eb9f-4a07-bf40-3980530b2e14" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.935369 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8940bf5d-2be4-4de3-88b0-26455e2338b4" containerName="registry-server" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.943530 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.947670 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.947700 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b5qwh" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.949329 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.957158 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l"] Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.959441 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.962322 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 17:16:06 crc kubenswrapper[4772]: I0930 17:16:06.988422 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l"] Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.088096 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kd7pn"] Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.089244 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.094668 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-pbhmq"] Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.094690 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.094735 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l75dr" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.095858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jsh\" (UniqueName: \"kubernetes.io/projected/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-kube-api-access-r9jsh\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.095904 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnmz\" (UniqueName: \"kubernetes.io/projected/90506d56-68ff-4821-9594-0bfaa2ef2b57-kube-api-access-rjnmz\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.095940 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-conf\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.095964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-reloader\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.095993 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-sockets\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.096014 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: 
\"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.096037 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-startup\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.096014 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.098830 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.098887 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.099009 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.099109 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 17:16:07 crc kubenswrapper[4772]: W0930 17:16:07.099121 4772 reflector.go:561] object-"metallb-system"/"controller-certs-secret": failed to list *v1.Secret: secrets "controller-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.099150 4772 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-certs-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.134223 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pbhmq"] Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201336 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-conf\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201380 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-reloader\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201403 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vlk\" (UniqueName: \"kubernetes.io/projected/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-kube-api-access-68vlk\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpk8t\" (UniqueName: \"kubernetes.io/projected/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-kube-api-access-tpk8t\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metallb-excludel2\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201466 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-sockets\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201482 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201498 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-cert\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-startup\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metrics-certs\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201579 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201598 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201621 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-metrics-certs\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201643 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jsh\" (UniqueName: \"kubernetes.io/projected/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-kube-api-access-r9jsh\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201668 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnmz\" (UniqueName: \"kubernetes.io/projected/90506d56-68ff-4821-9594-0bfaa2ef2b57-kube-api-access-rjnmz\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.201831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-conf\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.201956 4772 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.202006 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert podName:90506d56-68ff-4821-9594-0bfaa2ef2b57 nodeName:}" failed. No retries permitted until 2025-09-30 17:16:07.701985311 +0000 UTC m=+868.608998142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert") pod "frr-k8s-webhook-server-5478bdb765-q2g5l" (UID: "90506d56-68ff-4821-9594-0bfaa2ef2b57") : secret "frr-k8s-webhook-server-cert" not found Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.202211 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-sockets\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.202514 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-reloader\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.202603 4772 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.202652 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs podName:dbe8dedf-164d-43b2-9b38-4abcae7fb3e5 nodeName:}" failed. No retries permitted until 2025-09-30 17:16:07.702632668 +0000 UTC m=+868.609645499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs") pod "frr-k8s-f68tb" (UID: "dbe8dedf-164d-43b2-9b38-4abcae7fb3e5") : secret "frr-k8s-certs-secret" not found Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.202800 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.203155 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-frr-startup\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.241081 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jsh\" (UniqueName: \"kubernetes.io/projected/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-kube-api-access-r9jsh\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.241836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnmz\" (UniqueName: \"kubernetes.io/projected/90506d56-68ff-4821-9594-0bfaa2ef2b57-kube-api-access-rjnmz\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vlk\" (UniqueName: \"kubernetes.io/projected/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-kube-api-access-68vlk\") pod \"speaker-kd7pn\" 
(UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302614 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpk8t\" (UniqueName: \"kubernetes.io/projected/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-kube-api-access-tpk8t\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302632 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metallb-excludel2\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302675 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-cert\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302707 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metrics-certs\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.302752 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-metrics-certs\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.302997 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.303142 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist podName:d1d9e7ba-297f-4ef1-913a-afb210b83c2a nodeName:}" failed. No retries permitted until 2025-09-30 17:16:07.803109623 +0000 UTC m=+868.710122454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist") pod "speaker-kd7pn" (UID: "d1d9e7ba-297f-4ef1-913a-afb210b83c2a") : secret "metallb-memberlist" not found Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.303558 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metallb-excludel2\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.307567 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-metrics-certs\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.307817 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.317523 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-cert\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.327099 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vlk\" (UniqueName: \"kubernetes.io/projected/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-kube-api-access-68vlk\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.328751 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpk8t\" (UniqueName: \"kubernetes.io/projected/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-kube-api-access-tpk8t\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.710521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.710609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.714178 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbe8dedf-164d-43b2-9b38-4abcae7fb3e5-metrics-certs\") pod \"frr-k8s-f68tb\" (UID: \"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5\") " pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.715215 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/90506d56-68ff-4821-9594-0bfaa2ef2b57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-q2g5l\" (UID: \"90506d56-68ff-4821-9594-0bfaa2ef2b57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.811292 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.811480 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 17:16:07 crc kubenswrapper[4772]: E0930 17:16:07.811553 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist podName:d1d9e7ba-297f-4ef1-913a-afb210b83c2a nodeName:}" failed. No retries permitted until 2025-09-30 17:16:08.811534041 +0000 UTC m=+869.718546872 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist") pod "speaker-kd7pn" (UID: "d1d9e7ba-297f-4ef1-913a-afb210b83c2a") : secret "metallb-memberlist" not found Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.862355 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:07 crc kubenswrapper[4772]: I0930 17:16:07.876011 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.292191 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l"] Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.296485 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.312473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c392dd-0528-44c0-8fa6-85d8c33a4ac4-metrics-certs\") pod \"controller-5d688f5ffc-pbhmq\" (UID: \"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4\") " pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.320215 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.454457 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" event={"ID":"90506d56-68ff-4821-9594-0bfaa2ef2b57","Type":"ContainerStarted","Data":"5ded90329eb228e4628ed2328bff6f1f6eda9a1f3a57a8a8425f7aaa532f49e1"} Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.455433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"4c53e4439e1d89290f98510eea29ccc0c2fb11a8d6dc9503951e97ece1f09408"} Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.715289 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pbhmq"] Sep 30 17:16:08 crc kubenswrapper[4772]: W0930 17:16:08.723254 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c392dd_0528_44c0_8fa6_85d8c33a4ac4.slice/crio-3900cc5dc11b2e02e898dd6eec053fe2c10fd9cf8c1aa0c8866d6d9fa4860e81 WatchSource:0}: Error finding container 3900cc5dc11b2e02e898dd6eec053fe2c10fd9cf8c1aa0c8866d6d9fa4860e81: Status 404 returned error can't find the container with id 3900cc5dc11b2e02e898dd6eec053fe2c10fd9cf8c1aa0c8866d6d9fa4860e81 Sep 30 17:16:08 crc kubenswrapper[4772]: I0930 17:16:08.822873 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:08 crc kubenswrapper[4772]: E0930 17:16:08.823179 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 17:16:08 crc kubenswrapper[4772]: E0930 17:16:08.823376 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist podName:d1d9e7ba-297f-4ef1-913a-afb210b83c2a nodeName:}" failed. No retries permitted until 2025-09-30 17:16:10.823348777 +0000 UTC m=+871.730361608 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist") pod "speaker-kd7pn" (UID: "d1d9e7ba-297f-4ef1-913a-afb210b83c2a") : secret "metallb-memberlist" not found Sep 30 17:16:09 crc kubenswrapper[4772]: I0930 17:16:09.472304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pbhmq" event={"ID":"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4","Type":"ContainerStarted","Data":"cafc848e5c154fee72ea4565e4ea67609e651ab4f9392e6bcbe40c3f423368bf"} Sep 30 17:16:09 crc kubenswrapper[4772]: I0930 17:16:09.472352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pbhmq" event={"ID":"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4","Type":"ContainerStarted","Data":"163ac9994a7de4df2efd9d10dbe84bc89b0a26b3154c5a581f0d86087ac7a5fb"} Sep 30 17:16:09 crc kubenswrapper[4772]: I0930 17:16:09.472363 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pbhmq" event={"ID":"a7c392dd-0528-44c0-8fa6-85d8c33a4ac4","Type":"ContainerStarted","Data":"3900cc5dc11b2e02e898dd6eec053fe2c10fd9cf8c1aa0c8866d6d9fa4860e81"} Sep 30 17:16:09 crc kubenswrapper[4772]: I0930 17:16:09.472522 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:09 crc kubenswrapper[4772]: I0930 17:16:09.498114 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-pbhmq" podStartSLOduration=2.498087192 podStartE2EDuration="2.498087192s" podCreationTimestamp="2025-09-30 17:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:16:09.496287196 +0000 UTC m=+870.403300027" watchObservedRunningTime="2025-09-30 17:16:09.498087192 +0000 UTC m=+870.405100023" Sep 30 17:16:10 crc kubenswrapper[4772]: I0930 17:16:10.858834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:10 crc kubenswrapper[4772]: I0930 17:16:10.865693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1d9e7ba-297f-4ef1-913a-afb210b83c2a-memberlist\") pod \"speaker-kd7pn\" (UID: \"d1d9e7ba-297f-4ef1-913a-afb210b83c2a\") " pod="metallb-system/speaker-kd7pn" Sep 30 17:16:11 crc kubenswrapper[4772]: I0930 17:16:11.008514 4772 util.go:30] "No sandbox for pod can be found. 
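
The memberlist mount failed three times only because the speaker pod was created before the operator wrote the metallb-memberlist secret; the kubelet's retry loop absorbs the ordering race, and the attempt after the 2s backoff succeeds (17:16:10.865693 above). Code that needs to wait for a secret out-of-band follows the same poll-until-present shape; a client-go sketch under those assumptions (clientset construction omitted, namespace and secret name taken from the log):

    package waitfor

    import (
        "context"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
    )

    // secretExists polls until metallb-system/metallb-memberlist exists,
    // mirroring the kubelet's retry-until-the-secret-appears behaviour above.
    func secretExists(ctx context.Context, c kubernetes.Interface) error {
        return wait.PollUntilContextTimeout(ctx, 500*time.Millisecond, time.Minute, true,
            func(ctx context.Context) (bool, error) {
                _, err := c.CoreV1().Secrets("metallb-system").
                    Get(ctx, "metallb-memberlist", metav1.GetOptions{})
                if apierrors.IsNotFound(err) {
                    return false, nil // keep polling, like "durationBeforeRetry" above
                }
                return err == nil, err
            })
    }
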
Need to start a new one" pod="metallb-system/speaker-kd7pn" Sep 30 17:16:11 crc kubenswrapper[4772]: W0930 17:16:11.037381 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d9e7ba_297f_4ef1_913a_afb210b83c2a.slice/crio-76d7609145e0d6733259c638340b0e893262218460c11db5d1d94b2465b41ae0 WatchSource:0}: Error finding container 76d7609145e0d6733259c638340b0e893262218460c11db5d1d94b2465b41ae0: Status 404 returned error can't find the container with id 76d7609145e0d6733259c638340b0e893262218460c11db5d1d94b2465b41ae0 Sep 30 17:16:11 crc kubenswrapper[4772]: I0930 17:16:11.488505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kd7pn" event={"ID":"d1d9e7ba-297f-4ef1-913a-afb210b83c2a","Type":"ContainerStarted","Data":"18035469d780ee8366944ec5cc7bee43ec920ce974fa3081c3d23c8779cdc4a7"} Sep 30 17:16:11 crc kubenswrapper[4772]: I0930 17:16:11.489010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kd7pn" event={"ID":"d1d9e7ba-297f-4ef1-913a-afb210b83c2a","Type":"ContainerStarted","Data":"76d7609145e0d6733259c638340b0e893262218460c11db5d1d94b2465b41ae0"} Sep 30 17:16:12 crc kubenswrapper[4772]: I0930 17:16:12.511505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kd7pn" event={"ID":"d1d9e7ba-297f-4ef1-913a-afb210b83c2a","Type":"ContainerStarted","Data":"ee0aa500f8e96f29befc4fdb30fc9c9ad3a4e1be2421314a46cdd1fd086f00f7"} Sep 30 17:16:12 crc kubenswrapper[4772]: I0930 17:16:12.512813 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kd7pn" Sep 30 17:16:12 crc kubenswrapper[4772]: I0930 17:16:12.535227 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kd7pn" podStartSLOduration=5.535209621 podStartE2EDuration="5.535209621s" podCreationTimestamp="2025-09-30 17:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:16:12.528549689 +0000 UTC m=+873.435562540" watchObservedRunningTime="2025-09-30 17:16:12.535209621 +0000 UTC m=+873.442222452" Sep 30 17:16:15 crc kubenswrapper[4772]: I0930 17:16:15.531594 4772 generic.go:334] "Generic (PLEG): container finished" podID="dbe8dedf-164d-43b2-9b38-4abcae7fb3e5" containerID="94970aad99acb76223527eb6da33c29dfa6bbe206c52a8a681f6bd703db4a096" exitCode=0 Sep 30 17:16:15 crc kubenswrapper[4772]: I0930 17:16:15.531717 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerDied","Data":"94970aad99acb76223527eb6da33c29dfa6bbe206c52a8a681f6bd703db4a096"} Sep 30 17:16:15 crc kubenswrapper[4772]: I0930 17:16:15.533904 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" event={"ID":"90506d56-68ff-4821-9594-0bfaa2ef2b57","Type":"ContainerStarted","Data":"4cc949ad1b506b4f78d42e3241c07a17c2e32b65b3a25bcef0ecf0d32f188e9c"} Sep 30 17:16:15 crc kubenswrapper[4772]: I0930 17:16:15.534078 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:15 crc kubenswrapper[4772]: I0930 17:16:15.583255 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" podStartSLOduration=2.840467639 
podStartE2EDuration="9.583234832s" podCreationTimestamp="2025-09-30 17:16:06 +0000 UTC" firstStartedPulling="2025-09-30 17:16:08.302518278 +0000 UTC m=+869.209531109" lastFinishedPulling="2025-09-30 17:16:15.045285471 +0000 UTC m=+875.952298302" observedRunningTime="2025-09-30 17:16:15.582919594 +0000 UTC m=+876.489932455" watchObservedRunningTime="2025-09-30 17:16:15.583234832 +0000 UTC m=+876.490247673" Sep 30 17:16:16 crc kubenswrapper[4772]: I0930 17:16:16.544497 4772 generic.go:334] "Generic (PLEG): container finished" podID="dbe8dedf-164d-43b2-9b38-4abcae7fb3e5" containerID="8f386b4dd25a2efcb0d09c0fbf0ce395feac4fb77790927069b2a882ebfa1d0b" exitCode=0 Sep 30 17:16:16 crc kubenswrapper[4772]: I0930 17:16:16.544594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerDied","Data":"8f386b4dd25a2efcb0d09c0fbf0ce395feac4fb77790927069b2a882ebfa1d0b"} Sep 30 17:16:17 crc kubenswrapper[4772]: I0930 17:16:17.553863 4772 generic.go:334] "Generic (PLEG): container finished" podID="dbe8dedf-164d-43b2-9b38-4abcae7fb3e5" containerID="0896468025ab24c711d5a94990b7481d58fcf83b0b03f9f6dae5e15ff177a7c5" exitCode=0 Sep 30 17:16:17 crc kubenswrapper[4772]: I0930 17:16:17.553976 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerDied","Data":"0896468025ab24c711d5a94990b7481d58fcf83b0b03f9f6dae5e15ff177a7c5"} Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.324716 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-pbhmq" Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.569884 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"07ce55340152b2f1ff028837d0e432f21a008947b800d55eb9029167d8805c0b"} Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.569932 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"0223388f00c6bef8d85953ce8649463fab20349673e21bdf03a57dfe33e85cb2"} Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.569946 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"d28bdae5631d726ffc1e9689a5a09c01ea71a7a4ae31fc521a136309b4fad866"} Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.569958 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"cd2f78011d635399bbbb79c3287f1e3aefcf15f9bd2f9f58a01fb973ff0bde4b"} Sep 30 17:16:18 crc kubenswrapper[4772]: I0930 17:16:18.569972 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"ee5e83b4996318e16869a4cea43a3736cd3eef1b70b5de8f4ca7175bc0a35746"} Sep 30 17:16:19 crc kubenswrapper[4772]: I0930 17:16:19.580792 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f68tb" event={"ID":"dbe8dedf-164d-43b2-9b38-4abcae7fb3e5","Type":"ContainerStarted","Data":"8356b13d4b06119c8294dce3a4ffe061abadb88e5bf46baf499432aad0bc9013"} Sep 30 
17:16:19 crc kubenswrapper[4772]: I0930 17:16:19.581208 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:19 crc kubenswrapper[4772]: I0930 17:16:19.603224 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-f68tb" podStartSLOduration=6.607542821 podStartE2EDuration="13.603196638s" podCreationTimestamp="2025-09-30 17:16:06 +0000 UTC" firstStartedPulling="2025-09-30 17:16:08.016706123 +0000 UTC m=+868.923718954" lastFinishedPulling="2025-09-30 17:16:15.01235994 +0000 UTC m=+875.919372771" observedRunningTime="2025-09-30 17:16:19.60251593 +0000 UTC m=+880.509528771" watchObservedRunningTime="2025-09-30 17:16:19.603196638 +0000 UTC m=+880.510209499" Sep 30 17:16:21 crc kubenswrapper[4772]: I0930 17:16:21.012653 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kd7pn" Sep 30 17:16:22 crc kubenswrapper[4772]: I0930 17:16:22.863393 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:22 crc kubenswrapper[4772]: I0930 17:16:22.911367 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.842920 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.844364 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.848309 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.848382 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.849108 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b9lnk" Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.855039 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:23 crc kubenswrapper[4772]: I0930 17:16:23.952009 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmgq\" (UniqueName: \"kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq\") pod \"openstack-operator-index-5fjbx\" (UID: \"16758c63-4fab-4952-ab4f-765b9f41a1c3\") " pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:24 crc kubenswrapper[4772]: I0930 17:16:24.053510 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmgq\" (UniqueName: \"kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq\") pod \"openstack-operator-index-5fjbx\" (UID: \"16758c63-4fab-4952-ab4f-765b9f41a1c3\") " pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:24 crc kubenswrapper[4772]: I0930 17:16:24.073326 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmgq\" (UniqueName: \"kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq\") pod \"openstack-operator-index-5fjbx\" (UID: 
\"16758c63-4fab-4952-ab4f-765b9f41a1c3\") " pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:24 crc kubenswrapper[4772]: I0930 17:16:24.164833 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:24 crc kubenswrapper[4772]: I0930 17:16:24.589487 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:24 crc kubenswrapper[4772]: W0930 17:16:24.600084 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16758c63_4fab_4952_ab4f_765b9f41a1c3.slice/crio-4c4992bcff986cfc2e479c67b0358077f069c56250c1e063c292f2c3e7fc5180 WatchSource:0}: Error finding container 4c4992bcff986cfc2e479c67b0358077f069c56250c1e063c292f2c3e7fc5180: Status 404 returned error can't find the container with id 4c4992bcff986cfc2e479c67b0358077f069c56250c1e063c292f2c3e7fc5180 Sep 30 17:16:24 crc kubenswrapper[4772]: I0930 17:16:24.610097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fjbx" event={"ID":"16758c63-4fab-4952-ab4f-765b9f41a1c3","Type":"ContainerStarted","Data":"4c4992bcff986cfc2e479c67b0358077f069c56250c1e063c292f2c3e7fc5180"} Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.232007 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.641004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fjbx" event={"ID":"16758c63-4fab-4952-ab4f-765b9f41a1c3","Type":"ContainerStarted","Data":"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5"} Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.662300 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5fjbx" podStartSLOduration=2.484659005 podStartE2EDuration="4.662270215s" podCreationTimestamp="2025-09-30 17:16:23 +0000 UTC" firstStartedPulling="2025-09-30 17:16:24.602176262 +0000 UTC m=+885.509189093" lastFinishedPulling="2025-09-30 17:16:26.779787472 +0000 UTC m=+887.686800303" observedRunningTime="2025-09-30 17:16:27.655847019 +0000 UTC m=+888.562859870" watchObservedRunningTime="2025-09-30 17:16:27.662270215 +0000 UTC m=+888.569283046" Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.835339 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rg4rc"] Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.836604 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.841895 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rg4rc"] Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.866977 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-f68tb" Sep 30 17:16:27 crc kubenswrapper[4772]: I0930 17:16:27.885194 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-q2g5l" Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.006783 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdfn6\" (UniqueName: \"kubernetes.io/projected/04896286-2a65-451e-8639-d0f12941e991-kube-api-access-rdfn6\") pod \"openstack-operator-index-rg4rc\" (UID: \"04896286-2a65-451e-8639-d0f12941e991\") " pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.108271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfn6\" (UniqueName: \"kubernetes.io/projected/04896286-2a65-451e-8639-d0f12941e991-kube-api-access-rdfn6\") pod \"openstack-operator-index-rg4rc\" (UID: \"04896286-2a65-451e-8639-d0f12941e991\") " pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.127210 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfn6\" (UniqueName: \"kubernetes.io/projected/04896286-2a65-451e-8639-d0f12941e991-kube-api-access-rdfn6\") pod \"openstack-operator-index-rg4rc\" (UID: \"04896286-2a65-451e-8639-d0f12941e991\") " pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.160328 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.599066 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rg4rc"] Sep 30 17:16:28 crc kubenswrapper[4772]: W0930 17:16:28.603358 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04896286_2a65_451e_8639_d0f12941e991.slice/crio-9daddf776806c9232f1f9b07edc53248880e9c101793a92c6661daf9b3651bf1 WatchSource:0}: Error finding container 9daddf776806c9232f1f9b07edc53248880e9c101793a92c6661daf9b3651bf1: Status 404 returned error can't find the container with id 9daddf776806c9232f1f9b07edc53248880e9c101793a92c6661daf9b3651bf1 Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.648627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rg4rc" event={"ID":"04896286-2a65-451e-8639-d0f12941e991","Type":"ContainerStarted","Data":"9daddf776806c9232f1f9b07edc53248880e9c101793a92c6661daf9b3651bf1"} Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.648806 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5fjbx" podUID="16758c63-4fab-4952-ab4f-765b9f41a1c3" containerName="registry-server" containerID="cri-o://e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5" gracePeriod=2 Sep 30 17:16:28 crc kubenswrapper[4772]: I0930 17:16:28.986270 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.021964 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmgq\" (UniqueName: \"kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq\") pod \"16758c63-4fab-4952-ab4f-765b9f41a1c3\" (UID: \"16758c63-4fab-4952-ab4f-765b9f41a1c3\") " Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.030253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq" (OuterVolumeSpecName: "kube-api-access-qwmgq") pod "16758c63-4fab-4952-ab4f-765b9f41a1c3" (UID: "16758c63-4fab-4952-ab4f-765b9f41a1c3"). InnerVolumeSpecName "kube-api-access-qwmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.123859 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwmgq\" (UniqueName: \"kubernetes.io/projected/16758c63-4fab-4952-ab4f-765b9f41a1c3-kube-api-access-qwmgq\") on node \"crc\" DevicePath \"\"" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.657420 4772 generic.go:334] "Generic (PLEG): container finished" podID="16758c63-4fab-4952-ab4f-765b9f41a1c3" containerID="e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5" exitCode=0 Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.657487 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5fjbx" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.657506 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fjbx" event={"ID":"16758c63-4fab-4952-ab4f-765b9f41a1c3","Type":"ContainerDied","Data":"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5"} Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.657545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fjbx" event={"ID":"16758c63-4fab-4952-ab4f-765b9f41a1c3","Type":"ContainerDied","Data":"4c4992bcff986cfc2e479c67b0358077f069c56250c1e063c292f2c3e7fc5180"} Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.657564 4772 scope.go:117] "RemoveContainer" containerID="e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.660766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rg4rc" event={"ID":"04896286-2a65-451e-8639-d0f12941e991","Type":"ContainerStarted","Data":"fa9891a96e9002725205630e0ccaddafea65a3f0a6227899278a7a305997f465"} Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.675897 4772 scope.go:117] "RemoveContainer" containerID="e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5" Sep 30 17:16:29 crc kubenswrapper[4772]: E0930 17:16:29.676306 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5\": container with ID starting with e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5 not found: ID does not exist" containerID="e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.676331 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5"} err="failed to get container status \"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5\": rpc error: code = NotFound desc = could not find container \"e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5\": container with ID starting with e1745912771f8698110a3321158864fa536f5ce85e5d2b0ca73288d00cacd7b5 not found: ID does not exist" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.683794 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rg4rc" podStartSLOduration=2.633391589 podStartE2EDuration="2.683772821s" podCreationTimestamp="2025-09-30 17:16:27 +0000 UTC" firstStartedPulling="2025-09-30 17:16:28.606924105 +0000 UTC m=+889.513936936" lastFinishedPulling="2025-09-30 17:16:28.657305337 +0000 UTC m=+889.564318168" observedRunningTime="2025-09-30 17:16:29.677866398 +0000 UTC m=+890.584879239" watchObservedRunningTime="2025-09-30 17:16:29.683772821 +0000 UTC m=+890.590785652" Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.693191 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.698488 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5fjbx"] Sep 30 17:16:29 crc kubenswrapper[4772]: I0930 17:16:29.903804 4772 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="16758c63-4fab-4952-ab4f-765b9f41a1c3" path="/var/lib/kubelet/pods/16758c63-4fab-4952-ab4f-765b9f41a1c3/volumes" Sep 30 17:16:38 crc kubenswrapper[4772]: I0930 17:16:38.160603 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:38 crc kubenswrapper[4772]: I0930 17:16:38.161337 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:38 crc kubenswrapper[4772]: I0930 17:16:38.189832 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:38 crc kubenswrapper[4772]: I0930 17:16:38.749214 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rg4rc" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.294077 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz"] Sep 30 17:16:40 crc kubenswrapper[4772]: E0930 17:16:40.295271 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16758c63-4fab-4952-ab4f-765b9f41a1c3" containerName="registry-server" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.295287 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="16758c63-4fab-4952-ab4f-765b9f41a1c3" containerName="registry-server" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.295415 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="16758c63-4fab-4952-ab4f-765b9f41a1c3" containerName="registry-server" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.297590 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.304954 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nwhxg" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.311774 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz"] Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.496801 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.496873 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.496926 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlww\" (UniqueName: \"kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.598121 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.598192 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.598240 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlww\" (UniqueName: \"kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.598706 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.598845 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.635464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlww\" (UniqueName: \"kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww\") pod \"d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:40 crc kubenswrapper[4772]: I0930 17:16:40.666486 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:41 crc kubenswrapper[4772]: I0930 17:16:41.079053 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz"] Sep 30 17:16:41 crc kubenswrapper[4772]: E0930 17:16:41.400601 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737191b4_9bb7_402d_bbdb_603bae58da8a.slice/crio-ccf98caa429d67e56cf50b55d4fecd7daeb776b7ded9eee764f1a2aaa8dee8a5.scope\": RecentStats: unable to find data in memory cache]" Sep 30 17:16:41 crc kubenswrapper[4772]: I0930 17:16:41.748393 4772 generic.go:334] "Generic (PLEG): container finished" podID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerID="ccf98caa429d67e56cf50b55d4fecd7daeb776b7ded9eee764f1a2aaa8dee8a5" exitCode=0 Sep 30 17:16:41 crc kubenswrapper[4772]: I0930 17:16:41.748440 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" event={"ID":"737191b4-9bb7-402d-bbdb-603bae58da8a","Type":"ContainerDied","Data":"ccf98caa429d67e56cf50b55d4fecd7daeb776b7ded9eee764f1a2aaa8dee8a5"} Sep 30 17:16:41 crc kubenswrapper[4772]: I0930 17:16:41.748508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" event={"ID":"737191b4-9bb7-402d-bbdb-603bae58da8a","Type":"ContainerStarted","Data":"92bc26e8585fc9456842591e039671ba101c3ccf996bbe02534ba2f0302a733d"} Sep 30 17:16:42 crc kubenswrapper[4772]: I0930 17:16:42.757690 4772 generic.go:334] "Generic (PLEG): container finished" podID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerID="87c1e8cb7d218a2de828d4c54b135ee1afc00b909845ea394fe06f737e30907a" exitCode=0 Sep 30 17:16:42 crc kubenswrapper[4772]: I0930 17:16:42.757739 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" 
event={"ID":"737191b4-9bb7-402d-bbdb-603bae58da8a","Type":"ContainerDied","Data":"87c1e8cb7d218a2de828d4c54b135ee1afc00b909845ea394fe06f737e30907a"} Sep 30 17:16:43 crc kubenswrapper[4772]: I0930 17:16:43.770931 4772 generic.go:334] "Generic (PLEG): container finished" podID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerID="92b410d4ef85b1bdff6905a415a934c5877438deeb16a0382b3b5921ce5316c0" exitCode=0 Sep 30 17:16:43 crc kubenswrapper[4772]: I0930 17:16:43.770989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" event={"ID":"737191b4-9bb7-402d-bbdb-603bae58da8a","Type":"ContainerDied","Data":"92b410d4ef85b1bdff6905a415a934c5877438deeb16a0382b3b5921ce5316c0"} Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.048601 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.194393 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle\") pod \"737191b4-9bb7-402d-bbdb-603bae58da8a\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.195051 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxlww\" (UniqueName: \"kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww\") pod \"737191b4-9bb7-402d-bbdb-603bae58da8a\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.195124 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util\") pod \"737191b4-9bb7-402d-bbdb-603bae58da8a\" (UID: \"737191b4-9bb7-402d-bbdb-603bae58da8a\") " Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.196170 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle" (OuterVolumeSpecName: "bundle") pod "737191b4-9bb7-402d-bbdb-603bae58da8a" (UID: "737191b4-9bb7-402d-bbdb-603bae58da8a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.201809 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.203310 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww" (OuterVolumeSpecName: "kube-api-access-hxlww") pod "737191b4-9bb7-402d-bbdb-603bae58da8a" (UID: "737191b4-9bb7-402d-bbdb-603bae58da8a"). InnerVolumeSpecName "kube-api-access-hxlww". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.207437 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util" (OuterVolumeSpecName: "util") pod "737191b4-9bb7-402d-bbdb-603bae58da8a" (UID: "737191b4-9bb7-402d-bbdb-603bae58da8a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.302879 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxlww\" (UniqueName: \"kubernetes.io/projected/737191b4-9bb7-402d-bbdb-603bae58da8a-kube-api-access-hxlww\") on node \"crc\" DevicePath \"\"" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.302938 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/737191b4-9bb7-402d-bbdb-603bae58da8a-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.806701 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" event={"ID":"737191b4-9bb7-402d-bbdb-603bae58da8a","Type":"ContainerDied","Data":"92bc26e8585fc9456842591e039671ba101c3ccf996bbe02534ba2f0302a733d"} Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.806757 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92bc26e8585fc9456842591e039671ba101c3ccf996bbe02534ba2f0302a733d" Sep 30 17:16:47 crc kubenswrapper[4772]: I0930 17:16:47.806843 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.004859 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx"] Sep 30 17:16:53 crc kubenswrapper[4772]: E0930 17:16:53.005739 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="pull" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.005753 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="pull" Sep 30 17:16:53 crc kubenswrapper[4772]: E0930 17:16:53.005767 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="extract" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.005773 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="extract" Sep 30 17:16:53 crc kubenswrapper[4772]: E0930 17:16:53.005786 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="util" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.005793 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="util" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.005897 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="737191b4-9bb7-402d-bbdb-603bae58da8a" containerName="extract" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.006535 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.009882 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4dfth" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.041042 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx"] Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.077921 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zttkh\" (UniqueName: \"kubernetes.io/projected/6d89b985-cd07-43bc-9024-ff6ffd1adc45-kube-api-access-zttkh\") pod \"openstack-operator-controller-operator-5959786844-tbxrx\" (UID: \"6d89b985-cd07-43bc-9024-ff6ffd1adc45\") " pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.179568 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zttkh\" (UniqueName: \"kubernetes.io/projected/6d89b985-cd07-43bc-9024-ff6ffd1adc45-kube-api-access-zttkh\") pod \"openstack-operator-controller-operator-5959786844-tbxrx\" (UID: \"6d89b985-cd07-43bc-9024-ff6ffd1adc45\") " pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.208897 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zttkh\" (UniqueName: \"kubernetes.io/projected/6d89b985-cd07-43bc-9024-ff6ffd1adc45-kube-api-access-zttkh\") pod \"openstack-operator-controller-operator-5959786844-tbxrx\" (UID: \"6d89b985-cd07-43bc-9024-ff6ffd1adc45\") " pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.324178 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.763608 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx"] Sep 30 17:16:53 crc kubenswrapper[4772]: I0930 17:16:53.846277 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" event={"ID":"6d89b985-cd07-43bc-9024-ff6ffd1adc45","Type":"ContainerStarted","Data":"29da9e73d4896594dab7efec2bad1a095dca14b0b587a0424a35802e796f2d66"} Sep 30 17:16:57 crc kubenswrapper[4772]: I0930 17:16:57.873025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" event={"ID":"6d89b985-cd07-43bc-9024-ff6ffd1adc45","Type":"ContainerStarted","Data":"04010e7fdf1b81b1b807be5d4376b697d57ef1f660ae7c57e0d67bccd2d67098"} Sep 30 17:17:00 crc kubenswrapper[4772]: I0930 17:17:00.894338 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" event={"ID":"6d89b985-cd07-43bc-9024-ff6ffd1adc45","Type":"ContainerStarted","Data":"2a0569182da37a344b517716585891319a0a8faef80f6b4b3e519990646e3f08"} Sep 30 17:17:00 crc kubenswrapper[4772]: I0930 17:17:00.894979 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:17:00 crc kubenswrapper[4772]: I0930 17:17:00.943059 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" podStartSLOduration=2.400163595 podStartE2EDuration="8.943032115s" podCreationTimestamp="2025-09-30 17:16:52 +0000 UTC" firstStartedPulling="2025-09-30 17:16:53.760942548 +0000 UTC m=+914.667955389" lastFinishedPulling="2025-09-30 17:17:00.303811078 +0000 UTC m=+921.210823909" observedRunningTime="2025-09-30 17:17:00.940582721 +0000 UTC m=+921.847595582" watchObservedRunningTime="2025-09-30 17:17:00.943032115 +0000 UTC m=+921.850044946" Sep 30 17:17:03 crc kubenswrapper[4772]: I0930 17:17:03.326219 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5959786844-tbxrx" Sep 30 17:17:08 crc kubenswrapper[4772]: I0930 17:17:08.655864 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:17:08 crc kubenswrapper[4772]: I0930 17:17:08.656617 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.554020 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.555781 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.559869 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n6bk2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.564347 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.565708 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.568200 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dqjkl" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.581972 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.597589 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.599211 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.611313 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lmggp" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.628794 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.630284 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.633826 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dmm96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.640043 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.652145 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.656612 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.674120 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.675411 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.676564 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.678411 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f2f5c" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.686842 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pm95\" (UniqueName: \"kubernetes.io/projected/832335d3-7446-4879-8ec1-8f24d6d3708a-kube-api-access-6pm95\") pod \"designate-operator-controller-manager-84f4f7b77b-rqv96\" (UID: \"832335d3-7446-4879-8ec1-8f24d6d3708a\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.686878 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnjc\" (UniqueName: \"kubernetes.io/projected/314c8eb1-ee8d-405d-9bb6-a74de21c2f01-kube-api-access-pcnjc\") pod \"glance-operator-controller-manager-84958c4d49-ghtj2\" (UID: \"314c8eb1-ee8d-405d-9bb6-a74de21c2f01\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.686943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qglhp\" (UniqueName: \"kubernetes.io/projected/27e94b49-6017-4790-af32-61cdb6c41f2c-kube-api-access-qglhp\") pod \"heat-operator-controller-manager-5d889d78cf-b82x8\" (UID: \"27e94b49-6017-4790-af32-61cdb6c41f2c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.686980 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jrq\" (UniqueName: \"kubernetes.io/projected/ad2965ed-ed78-4646-97ae-07cce49e8eb1-kube-api-access-x4jrq\") pod \"cinder-operator-controller-manager-644bddb6d8-n7w7p\" (UID: \"ad2965ed-ed78-4646-97ae-07cce49e8eb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.686999 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87fc\" (UniqueName: \"kubernetes.io/projected/80d5010e-a767-491b-bcb2-89272762a121-kube-api-access-p87fc\") pod \"barbican-operator-controller-manager-6ff8b75857-tzz8m\" (UID: \"80d5010e-a767-491b-bcb2-89272762a121\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.690158 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.691136 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.695568 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-w6vfc" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.724153 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.726767 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.731426 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hh7d6" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.731826 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.738983 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.753904 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.755079 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.758368 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wq6sr" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.766752 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.768337 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.771333 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d4s8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.777856 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.779885 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.791082 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.792281 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qglhp\" (UniqueName: \"kubernetes.io/projected/27e94b49-6017-4790-af32-61cdb6c41f2c-kube-api-access-qglhp\") pod \"heat-operator-controller-manager-5d889d78cf-b82x8\" (UID: \"27e94b49-6017-4790-af32-61cdb6c41f2c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.792345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jrq\" (UniqueName: \"kubernetes.io/projected/ad2965ed-ed78-4646-97ae-07cce49e8eb1-kube-api-access-x4jrq\") pod \"cinder-operator-controller-manager-644bddb6d8-n7w7p\" (UID: \"ad2965ed-ed78-4646-97ae-07cce49e8eb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.792367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87fc\" (UniqueName: \"kubernetes.io/projected/80d5010e-a767-491b-bcb2-89272762a121-kube-api-access-p87fc\") pod \"barbican-operator-controller-manager-6ff8b75857-tzz8m\" (UID: \"80d5010e-a767-491b-bcb2-89272762a121\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.792390 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pm95\" (UniqueName: \"kubernetes.io/projected/832335d3-7446-4879-8ec1-8f24d6d3708a-kube-api-access-6pm95\") pod \"designate-operator-controller-manager-84f4f7b77b-rqv96\" (UID: \"832335d3-7446-4879-8ec1-8f24d6d3708a\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.792413 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnjc\" (UniqueName: \"kubernetes.io/projected/314c8eb1-ee8d-405d-9bb6-a74de21c2f01-kube-api-access-pcnjc\") pod \"glance-operator-controller-manager-84958c4d49-ghtj2\" (UID: \"314c8eb1-ee8d-405d-9bb6-a74de21c2f01\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.800125 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.801401 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.806918 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rn9hs" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.820290 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.829252 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.832854 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.836191 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8rz9k" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.845087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pm95\" (UniqueName: \"kubernetes.io/projected/832335d3-7446-4879-8ec1-8f24d6d3708a-kube-api-access-6pm95\") pod \"designate-operator-controller-manager-84f4f7b77b-rqv96\" (UID: \"832335d3-7446-4879-8ec1-8f24d6d3708a\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.847015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qglhp\" (UniqueName: \"kubernetes.io/projected/27e94b49-6017-4790-af32-61cdb6c41f2c-kube-api-access-qglhp\") pod \"heat-operator-controller-manager-5d889d78cf-b82x8\" (UID: \"27e94b49-6017-4790-af32-61cdb6c41f2c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.857772 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnjc\" (UniqueName: \"kubernetes.io/projected/314c8eb1-ee8d-405d-9bb6-a74de21c2f01-kube-api-access-pcnjc\") pod \"glance-operator-controller-manager-84958c4d49-ghtj2\" (UID: \"314c8eb1-ee8d-405d-9bb6-a74de21c2f01\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.864753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jrq\" (UniqueName: \"kubernetes.io/projected/ad2965ed-ed78-4646-97ae-07cce49e8eb1-kube-api-access-x4jrq\") pod \"cinder-operator-controller-manager-644bddb6d8-n7w7p\" (UID: \"ad2965ed-ed78-4646-97ae-07cce49e8eb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.865098 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87fc\" (UniqueName: \"kubernetes.io/projected/80d5010e-a767-491b-bcb2-89272762a121-kube-api-access-p87fc\") pod \"barbican-operator-controller-manager-6ff8b75857-tzz8m\" (UID: \"80d5010e-a767-491b-bcb2-89272762a121\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.873118 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.874641 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.878489 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.880212 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qddkj" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.883575 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-z472f"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.889623 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.892412 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893078 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8g7\" (UniqueName: \"kubernetes.io/projected/d10d7495-42f5-4919-8985-99913d62ab28-kube-api-access-hc8g7\") pod \"ironic-operator-controller-manager-7975b88857-vdhkv\" (UID: \"d10d7495-42f5-4919-8985-99913d62ab28\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qg9p\" (UniqueName: \"kubernetes.io/projected/44aed112-2ebc-48b6-b3b4-9a47d2dafaa9-kube-api-access-6qg9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-56vtc\" (UID: \"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcpn\" (UniqueName: \"kubernetes.io/projected/b7ba1160-070d-4cc4-9c53-75817bd6141e-kube-api-access-pzcpn\") pod \"horizon-operator-controller-manager-9f4696d94-dn2kt\" (UID: \"b7ba1160-070d-4cc4-9c53-75817bd6141e\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893184 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdjc\" (UniqueName: \"kubernetes.io/projected/058eec37-9f59-4fc5-8fa3-c9595bf58300-kube-api-access-cmdjc\") pod 
\"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.893297 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.895169 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.895958 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s9f66" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.902040 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.902968 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w84xt" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.913502 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.936836 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-z472f"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.946776 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.947963 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.956878 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v"] Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.994952 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjc\" (UniqueName: \"kubernetes.io/projected/058eec37-9f59-4fc5-8fa3-c9595bf58300-kube-api-access-cmdjc\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995426 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvzb\" (UniqueName: \"kubernetes.io/projected/c886af64-f9cc-4127-9d17-3007ae492d06-kube-api-access-9nvzb\") pod \"manila-operator-controller-manager-6d68dbc695-2fx6p\" (UID: \"c886af64-f9cc-4127-9d17-3007ae492d06\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzctf\" (UniqueName: \"kubernetes.io/projected/df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff-kube-api-access-tzctf\") pod \"nova-operator-controller-manager-c7c776c96-z472f\" (UID: \"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmvd\" (UniqueName: \"kubernetes.io/projected/1753608a-67af-4fa4-83f1-3f7d1623fc6b-kube-api-access-2vmvd\") pod \"mariadb-operator-controller-manager-88c7-vlpqf\" (UID: \"1753608a-67af-4fa4-83f1-3f7d1623fc6b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995762 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8g7\" (UniqueName: \"kubernetes.io/projected/d10d7495-42f5-4919-8985-99913d62ab28-kube-api-access-hc8g7\") pod \"ironic-operator-controller-manager-7975b88857-vdhkv\" (UID: \"d10d7495-42f5-4919-8985-99913d62ab28\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995863 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7z7j\" (UniqueName: \"kubernetes.io/projected/51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc-kube-api-access-t7z7j\") pod \"neutron-operator-controller-manager-64d7b59854-smllw\" (UID: \"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.995966 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:20 crc 
kubenswrapper[4772]: I0930 17:17:20.996077 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qg9p\" (UniqueName: \"kubernetes.io/projected/44aed112-2ebc-48b6-b3b4-9a47d2dafaa9-kube-api-access-6qg9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-56vtc\" (UID: \"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc"
Sep 30 17:17:20 crc kubenswrapper[4772]: I0930 17:17:20.996251 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcpn\" (UniqueName: \"kubernetes.io/projected/b7ba1160-070d-4cc4-9c53-75817bd6141e-kube-api-access-pzcpn\") pod \"horizon-operator-controller-manager-9f4696d94-dn2kt\" (UID: \"b7ba1160-070d-4cc4-9c53-75817bd6141e\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt"
Sep 30 17:17:20 crc kubenswrapper[4772]: E0930 17:17:20.997080 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Sep 30 17:17:20 crc kubenswrapper[4772]: E0930 17:17:20.999606 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert podName:058eec37-9f59-4fc5-8fa3-c9595bf58300 nodeName:}" failed. No retries permitted until 2025-09-30 17:17:21.49956337 +0000 UTC m=+942.406576201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert") pod "infra-operator-controller-manager-7d857cc749-shhhk" (UID: "058eec37-9f59-4fc5-8fa3-c9595bf58300") : secret "infra-operator-webhook-server-cert" not found
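
The two E-lines above are the interesting ones: the secret volume plugin cannot find the Secret openstack-operators/infra-operator-webhook-server-cert, so MountVolume.SetUp for the pod's "cert" volume fails and the operation is parked with a 500ms durationBeforeRetry. The secret is typically issued asynchronously (e.g., by cert-manager or OLM), and the mount does succeed at 17:17:21.522811 below once it appears. A minimal client-go sketch of the same wait-for-secret check; this is not kubelet's own code, and the kubeconfig path is an assumption:

package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and Secret name are taken from the error lines above.
	err = wait.PollUntilContextTimeout(context.Background(), 2*time.Second, 5*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			_, err := cs.CoreV1().Secrets("openstack-operators").Get(ctx,
				"infra-operator-webhook-server-cert", metav1.GetOptions{})
			if apierrors.IsNotFound(err) {
				return false, nil // keep polling; the issuer may not have created it yet
			}
			return err == nil, err
		})
	if err != nil {
		panic(err)
	}
	fmt.Println("secret present; the pending cert mount can now succeed")
}
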
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.063645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcpn\" (UniqueName: \"kubernetes.io/projected/b7ba1160-070d-4cc4-9c53-75817bd6141e-kube-api-access-pzcpn\") pod \"horizon-operator-controller-manager-9f4696d94-dn2kt\" (UID: \"b7ba1160-070d-4cc4-9c53-75817bd6141e\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.064781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdjc\" (UniqueName: \"kubernetes.io/projected/058eec37-9f59-4fc5-8fa3-c9595bf58300-kube-api-access-cmdjc\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.067167 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8g7\" (UniqueName: \"kubernetes.io/projected/d10d7495-42f5-4919-8985-99913d62ab28-kube-api-access-hc8g7\") pod \"ironic-operator-controller-manager-7975b88857-vdhkv\" (UID: \"d10d7495-42f5-4919-8985-99913d62ab28\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.069011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qg9p\" (UniqueName: \"kubernetes.io/projected/44aed112-2ebc-48b6-b3b4-9a47d2dafaa9-kube-api-access-6qg9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-56vtc\" (UID: \"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.070737 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.081885 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.086625 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4lvpx" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.091874 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.100224 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxlvn\" (UniqueName: \"kubernetes.io/projected/13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36-kube-api-access-zxlvn\") pod \"octavia-operator-controller-manager-76fcc6dc7c-kpr6v\" (UID: \"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.100899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvzb\" (UniqueName: \"kubernetes.io/projected/c886af64-f9cc-4127-9d17-3007ae492d06-kube-api-access-9nvzb\") pod \"manila-operator-controller-manager-6d68dbc695-2fx6p\" (UID: \"c886af64-f9cc-4127-9d17-3007ae492d06\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.102679 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzctf\" (UniqueName: \"kubernetes.io/projected/df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff-kube-api-access-tzctf\") pod \"nova-operator-controller-manager-c7c776c96-z472f\" (UID: \"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.102735 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmvd\" (UniqueName: \"kubernetes.io/projected/1753608a-67af-4fa4-83f1-3f7d1623fc6b-kube-api-access-2vmvd\") pod \"mariadb-operator-controller-manager-88c7-vlpqf\" (UID: \"1753608a-67af-4fa4-83f1-3f7d1623fc6b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.102796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7z7j\" (UniqueName: \"kubernetes.io/projected/51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc-kube-api-access-t7z7j\") pod \"neutron-operator-controller-manager-64d7b59854-smllw\" (UID: \"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.116173 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.137675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzctf\" (UniqueName: \"kubernetes.io/projected/df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff-kube-api-access-tzctf\") pod \"nova-operator-controller-manager-c7c776c96-z472f\" (UID: \"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.137773 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.138347 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7z7j\" (UniqueName: \"kubernetes.io/projected/51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc-kube-api-access-t7z7j\") pod 
\"neutron-operator-controller-manager-64d7b59854-smllw\" (UID: \"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.154354 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.159971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvzb\" (UniqueName: \"kubernetes.io/projected/c886af64-f9cc-4127-9d17-3007ae492d06-kube-api-access-9nvzb\") pod \"manila-operator-controller-manager-6d68dbc695-2fx6p\" (UID: \"c886af64-f9cc-4127-9d17-3007ae492d06\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.162446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmvd\" (UniqueName: \"kubernetes.io/projected/1753608a-67af-4fa4-83f1-3f7d1623fc6b-kube-api-access-2vmvd\") pod \"mariadb-operator-controller-manager-88c7-vlpqf\" (UID: \"1753608a-67af-4fa4-83f1-3f7d1623fc6b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.167822 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.171207 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k7hp7" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.173023 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.179618 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.179881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9mrdx" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.208254 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxlvn\" (UniqueName: \"kubernetes.io/projected/13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36-kube-api-access-zxlvn\") pod \"octavia-operator-controller-manager-76fcc6dc7c-kpr6v\" (UID: \"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.208375 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvp4n\" (UniqueName: \"kubernetes.io/projected/6c9f85e1-5df7-4943-9064-69af6e200e82-kube-api-access-pvp4n\") pod \"ovn-operator-controller-manager-9976ff44c-8xbv5\" (UID: \"6c9f85e1-5df7-4943-9064-69af6e200e82\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.224316 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.246149 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.247643 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.250377 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8nsqt" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.266703 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.275728 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.285091 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxlvn\" (UniqueName: \"kubernetes.io/projected/13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36-kube-api-access-zxlvn\") pod \"octavia-operator-controller-manager-76fcc6dc7c-kpr6v\" (UID: \"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.288736 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-74wxc" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.291564 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.309297 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.309376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvp4n\" (UniqueName: \"kubernetes.io/projected/6c9f85e1-5df7-4943-9064-69af6e200e82-kube-api-access-pvp4n\") pod \"ovn-operator-controller-manager-9976ff44c-8xbv5\" (UID: \"6c9f85e1-5df7-4943-9064-69af6e200e82\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.309399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6fk\" (UniqueName: \"kubernetes.io/projected/f3a0e5a3-c50e-48ce-801d-f7916210165b-kube-api-access-cw6fk\") pod \"placement-operator-controller-manager-589c58c6c-mfnlh\" (UID: \"f3a0e5a3-c50e-48ce-801d-f7916210165b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.309427 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5fv\" (UniqueName: \"kubernetes.io/projected/69e18d49-1290-4440-a3c9-885352fa18c5-kube-api-access-md5fv\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.324416 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.336432 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.359670 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.370403 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.376241 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.381169 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvp4n\" (UniqueName: \"kubernetes.io/projected/6c9f85e1-5df7-4943-9064-69af6e200e82-kube-api-access-pvp4n\") pod \"ovn-operator-controller-manager-9976ff44c-8xbv5\" (UID: \"6c9f85e1-5df7-4943-9064-69af6e200e82\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.381793 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.400888 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.411260 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.412976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbpj\" (UniqueName: \"kubernetes.io/projected/f8af3992-c401-4dea-b5a5-92063a05384e-kube-api-access-9qbpj\") pod \"swift-operator-controller-manager-bc7dc7bd9-6m9mb\" (UID: \"f8af3992-c401-4dea-b5a5-92063a05384e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.413035 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.413146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6fk\" (UniqueName: \"kubernetes.io/projected/f3a0e5a3-c50e-48ce-801d-f7916210165b-kube-api-access-cw6fk\") pod \"placement-operator-controller-manager-589c58c6c-mfnlh\" (UID: \"f3a0e5a3-c50e-48ce-801d-f7916210165b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.413191 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md5fv\" (UniqueName: \"kubernetes.io/projected/69e18d49-1290-4440-a3c9-885352fa18c5-kube-api-access-md5fv\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.413238 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsdx\" (UniqueName: \"kubernetes.io/projected/d4295a68-a2dc-4b0b-a577-bbd6448d3a70-kube-api-access-svsdx\") pod \"telemetry-operator-controller-manager-b8d54b5d7-b6np7\" (UID: \"d4295a68-a2dc-4b0b-a577-bbd6448d3a70\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:21 crc kubenswrapper[4772]: E0930 17:17:21.413443 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:17:21 crc kubenswrapper[4772]: E0930 17:17:21.413496 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert podName:69e18d49-1290-4440-a3c9-885352fa18c5 nodeName:}" failed. No retries permitted until 2025-09-30 17:17:21.913477245 +0000 UTC m=+942.820490076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-dxbpz" (UID: "69e18d49-1290-4440-a3c9-885352fa18c5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.421510 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.434827 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.459016 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md5fv\" (UniqueName: \"kubernetes.io/projected/69e18d49-1290-4440-a3c9-885352fa18c5-kube-api-access-md5fv\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.469245 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6fk\" (UniqueName: \"kubernetes.io/projected/f3a0e5a3-c50e-48ce-801d-f7916210165b-kube-api-access-cw6fk\") pod \"placement-operator-controller-manager-589c58c6c-mfnlh\" (UID: \"f3a0e5a3-c50e-48ce-801d-f7916210165b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.492355 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.500078 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.504550 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-45frb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.514133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbpj\" (UniqueName: \"kubernetes.io/projected/f8af3992-c401-4dea-b5a5-92063a05384e-kube-api-access-9qbpj\") pod \"swift-operator-controller-manager-bc7dc7bd9-6m9mb\" (UID: \"f8af3992-c401-4dea-b5a5-92063a05384e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.514203 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.514267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsdx\" (UniqueName: \"kubernetes.io/projected/d4295a68-a2dc-4b0b-a577-bbd6448d3a70-kube-api-access-svsdx\") pod \"telemetry-operator-controller-manager-b8d54b5d7-b6np7\" (UID: \"d4295a68-a2dc-4b0b-a577-bbd6448d3a70\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.522811 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/058eec37-9f59-4fc5-8fa3-c9595bf58300-cert\") pod \"infra-operator-controller-manager-7d857cc749-shhhk\" (UID: \"058eec37-9f59-4fc5-8fa3-c9595bf58300\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.530447 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.550782 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.554560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsdx\" (UniqueName: \"kubernetes.io/projected/d4295a68-a2dc-4b0b-a577-bbd6448d3a70-kube-api-access-svsdx\") pod \"telemetry-operator-controller-manager-b8d54b5d7-b6np7\" (UID: \"d4295a68-a2dc-4b0b-a577-bbd6448d3a70\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.575324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbpj\" (UniqueName: \"kubernetes.io/projected/f8af3992-c401-4dea-b5a5-92063a05384e-kube-api-access-9qbpj\") pod \"swift-operator-controller-manager-bc7dc7bd9-6m9mb\" (UID: \"f8af3992-c401-4dea-b5a5-92063a05384e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.617477 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqd2j\" (UniqueName: \"kubernetes.io/projected/5b10f12b-b24a-4cf6-b07b-7b3e811ccd30-kube-api-access-jqd2j\") pod \"test-operator-controller-manager-f66b554c6-xmwpp\" (UID: \"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.626586 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.627772 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.631536 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wqhxg" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.655817 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.674413 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.721011 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.722039 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqd2j\" (UniqueName: \"kubernetes.io/projected/5b10f12b-b24a-4cf6-b07b-7b3e811ccd30-kube-api-access-jqd2j\") pod \"test-operator-controller-manager-f66b554c6-xmwpp\" (UID: \"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.722126 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lts5j\" (UniqueName: \"kubernetes.io/projected/4fcd6b42-8644-41f5-bd3b-51184d34cd00-kube-api-access-lts5j\") pod \"watcher-operator-controller-manager-86c75f6bd4-4fnzg\" (UID: \"4fcd6b42-8644-41f5-bd3b-51184d34cd00\") " pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.750187 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqd2j\" (UniqueName: \"kubernetes.io/projected/5b10f12b-b24a-4cf6-b07b-7b3e811ccd30-kube-api-access-jqd2j\") pod \"test-operator-controller-manager-f66b554c6-xmwpp\" (UID: \"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.764548 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.773134 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.773255 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.777748 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.777787 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tpgg2" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.823529 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lts5j\" (UniqueName: \"kubernetes.io/projected/4fcd6b42-8644-41f5-bd3b-51184d34cd00-kube-api-access-lts5j\") pod \"watcher-operator-controller-manager-86c75f6bd4-4fnzg\" (UID: \"4fcd6b42-8644-41f5-bd3b-51184d34cd00\") " pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.874295 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.881128 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc"] Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.884500 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.893210 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zskzd" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.913500 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.914647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lts5j\" (UniqueName: \"kubernetes.io/projected/4fcd6b42-8644-41f5-bd3b-51184d34cd00-kube-api-access-lts5j\") pod \"watcher-operator-controller-manager-86c75f6bd4-4fnzg\" (UID: \"4fcd6b42-8644-41f5-bd3b-51184d34cd00\") " pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.917169 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.925751 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.925940 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp967\" (UniqueName: \"kubernetes.io/projected/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-kube-api-access-xp967\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:21 crc kubenswrapper[4772]: I0930 17:17:21.926135 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:21 crc kubenswrapper[4772]: E0930 17:17:21.926718 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:17:21 crc kubenswrapper[4772]: E0930 17:17:21.926943 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert podName:69e18d49-1290-4440-a3c9-885352fa18c5 nodeName:}" failed. No retries permitted until 2025-09-30 17:17:22.92681596 +0000 UTC m=+943.833828791 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-dxbpz" (UID: "69e18d49-1290-4440-a3c9-885352fa18c5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.014010 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.014080 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.014098 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.022005 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.030746 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6mx\" (UniqueName: \"kubernetes.io/projected/1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f-kube-api-access-2h6mx\") pod \"rabbitmq-cluster-operator-manager-79d8469568-swgvc\" (UID: \"1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.031041 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.031186 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp967\" (UniqueName: \"kubernetes.io/projected/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-kube-api-access-xp967\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: E0930 17:17:22.031975 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 17:17:22 crc kubenswrapper[4772]: E0930 17:17:22.032205 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert podName:d5be7b91-f881-4cd5-878e-1d40a94a3a8d nodeName:}" failed. No retries permitted until 2025-09-30 17:17:22.532169293 +0000 UTC m=+943.439182314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert") pod "openstack-operator-controller-manager-5dd9b5767f-p4n9f" (UID: "d5be7b91-f881-4cd5-878e-1d40a94a3a8d") : secret "webhook-server-cert" not found Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.065676 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp967\" (UniqueName: \"kubernetes.io/projected/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-kube-api-access-xp967\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.111931 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" event={"ID":"ad2965ed-ed78-4646-97ae-07cce49e8eb1","Type":"ContainerStarted","Data":"2948eaa531bdafc105bf4e7b53228ef292babaffd3672b7f5e045fed4a4c9d9c"} Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.113935 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" event={"ID":"80d5010e-a767-491b-bcb2-89272762a121","Type":"ContainerStarted","Data":"640acc0669ab105a8df3927275b75002ba1077c6d9c65fffa2f5725f19755063"} Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.134130 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6mx\" (UniqueName: \"kubernetes.io/projected/1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f-kube-api-access-2h6mx\") pod \"rabbitmq-cluster-operator-manager-79d8469568-swgvc\" (UID: \"1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.167459 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6mx\" (UniqueName: \"kubernetes.io/projected/1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f-kube-api-access-2h6mx\") pod \"rabbitmq-cluster-operator-manager-79d8469568-swgvc\" (UID: \"1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.315215 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.348640 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.395311 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" Sep 30 17:17:22 crc kubenswrapper[4772]: W0930 17:17:22.427769 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832335d3_7446_4879_8ec1_8f24d6d3708a.slice/crio-2f74a4b3b9573953e0835798a1890152a9a11ff82af17ee8fad4facad74153d2 WatchSource:0}: Error finding container 2f74a4b3b9573953e0835798a1890152a9a11ff82af17ee8fad4facad74153d2: Status 404 returned error can't find the container with id 2f74a4b3b9573953e0835798a1890152a9a11ff82af17ee8fad4facad74153d2 Sep 30 17:17:22 crc kubenswrapper[4772]: W0930 17:17:22.448052 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44aed112_2ebc_48b6_b3b4_9a47d2dafaa9.slice/crio-1d77d8d6d73e9ada5b8634c27ceb7be6e3f29b0c75310e590be2cbe41eb64481 WatchSource:0}: Error finding container 1d77d8d6d73e9ada5b8634c27ceb7be6e3f29b0c75310e590be2cbe41eb64481: Status 404 returned error can't find the container with id 1d77d8d6d73e9ada5b8634c27ceb7be6e3f29b0c75310e590be2cbe41eb64481 Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.540967 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.549763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5be7b91-f881-4cd5-878e-1d40a94a3a8d-cert\") pod \"openstack-operator-controller-manager-5dd9b5767f-p4n9f\" (UID: \"d5be7b91-f881-4cd5-878e-1d40a94a3a8d\") " pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.552091 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.812015 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2"] Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.946127 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:22 crc kubenswrapper[4772]: I0930 17:17:22.954577 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e18d49-1290-4440-a3c9-885352fa18c5-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-dxbpz\" (UID: \"69e18d49-1290-4440-a3c9-885352fa18c5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.029875 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.131870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" event={"ID":"832335d3-7446-4879-8ec1-8f24d6d3708a","Type":"ContainerStarted","Data":"2f74a4b3b9573953e0835798a1890152a9a11ff82af17ee8fad4facad74153d2"} Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.134549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" event={"ID":"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9","Type":"ContainerStarted","Data":"1d77d8d6d73e9ada5b8634c27ceb7be6e3f29b0c75310e590be2cbe41eb64481"} Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.136192 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" event={"ID":"314c8eb1-ee8d-405d-9bb6-a74de21c2f01","Type":"ContainerStarted","Data":"4732f5c21783772bfee4fcfde980dcaa88db719cd90fe60831d9c046367f5fe5"} Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.287821 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.298899 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.309950 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.331393 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.337896 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.346279 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-z472f"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.361162 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.367446 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.377008 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.383312 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.489773 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.500602 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb"] Sep 30 17:17:23 crc 
kubenswrapper[4772]: E0930 17:17:23.523480 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hc8g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-7975b88857-vdhkv_openstack-operators(d10d7495-42f5-4919-8985-99913d62ab28): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:17:23 crc kubenswrapper[4772]: W0930 17:17:23.538578 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5be7b91_f881_4cd5_878e_1d40a94a3a8d.slice/crio-e61172b1a50ae9cc913605b12ce634c2771b4a98fd1cbca74a95f7121663ef19 WatchSource:0}: Error finding container e61172b1a50ae9cc913605b12ce634c2771b4a98fd1cbca74a95f7121663ef19: Status 404 returned error can't find the container with id e61172b1a50ae9cc913605b12ce634c2771b4a98fd1cbca74a95f7121663ef19 Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.542036 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p"] Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.547100 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nvzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6d68dbc695-2fx6p_openstack-operators(c886af64-f9cc-4127-9d17-3007ae492d06): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.565332 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.611241 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.624500 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp"] Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.654457 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 
10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvp4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-8xbv5_openstack-operators(6c9f85e1-5df7-4943-9064-69af6e200e82): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.670309 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqd2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-xmwpp_openstack-operators(5b10f12b-b24a-4cf6-b07b-7b3e811ccd30): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.683581 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc"] Sep 30 17:17:23 crc kubenswrapper[4772]: I0930 17:17:23.719976 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz"] Sep 30 17:17:23 crc kubenswrapper[4772]: W0930 17:17:23.720022 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8f518a_f6a2_4bfc_a4ed_d6580a97f55f.slice/crio-65016683b50a5cce5099fe561473df772cd9ec57c106f7895a6d30ea9f3b5935 WatchSource:0}: Error finding container 65016683b50a5cce5099fe561473df772cd9ec57c106f7895a6d30ea9f3b5935: Status 404 returned error can't find the container with id 65016683b50a5cce5099fe561473df772cd9ec57c106f7895a6d30ea9f3b5935 Sep 30 17:17:23 crc kubenswrapper[4772]: W0930 17:17:23.800333 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e18d49_1290_4440_a3c9_885352fa18c5.slice/crio-4105c17add64e066f661a50250348fc7386a98b154f1a22fa86574aa11541aee WatchSource:0}: Error finding container 4105c17add64e066f661a50250348fc7386a98b154f1a22fa86574aa11541aee: Status 404 returned error can't find the container with id 4105c17add64e066f661a50250348fc7386a98b154f1a22fa86574aa11541aee Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.807780 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Val
ue:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.
io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/o
penstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6d776955-dxbpz_openstack-operators(69e18d49-1290-4440-a3c9-885352fa18c5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.918169 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podUID="c886af64-f9cc-4127-9d17-3007ae492d06" Sep 30 17:17:23 crc kubenswrapper[4772]: E0930 17:17:23.920546 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podUID="d10d7495-42f5-4919-8985-99913d62ab28" Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.089519 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" podUID="5b10f12b-b24a-4cf6-b07b-7b3e811ccd30" Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.129243 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" podUID="69e18d49-1290-4440-a3c9-885352fa18c5" Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.183098 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podUID="6c9f85e1-5df7-4943-9064-69af6e200e82" Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.217576 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" event={"ID":"1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f","Type":"ContainerStarted","Data":"65016683b50a5cce5099fe561473df772cd9ec57c106f7895a6d30ea9f3b5935"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.219618 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" event={"ID":"1753608a-67af-4fa4-83f1-3f7d1623fc6b","Type":"ContainerStarted","Data":"18d9b8637c3472780c050b0fd6702faa0d720b1d87d8a929286365d129e45938"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.223858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" event={"ID":"69e18d49-1290-4440-a3c9-885352fa18c5","Type":"ContainerStarted","Data":"8d01b75d819b3d5a99ae085f587919ec3894ad7ae7fed15e95b23fc84a9df1c9"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.223931 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" event={"ID":"69e18d49-1290-4440-a3c9-885352fa18c5","Type":"ContainerStarted","Data":"4105c17add64e066f661a50250348fc7386a98b154f1a22fa86574aa11541aee"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.228003 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" event={"ID":"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff","Type":"ContainerStarted","Data":"fe47ce66256ffcb5a6797e3d2732d4b533c230795d5543f99cb09ecb938f1ede"} Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.228439 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" podUID="69e18d49-1290-4440-a3c9-885352fa18c5" Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.230319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" event={"ID":"c886af64-f9cc-4127-9d17-3007ae492d06","Type":"ContainerStarted","Data":"35781e7db846d1be6a25c89738a703196465e5fafd071548c220aeb022671b48"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.230373 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" event={"ID":"c886af64-f9cc-4127-9d17-3007ae492d06","Type":"ContainerStarted","Data":"1edd4bf678d17defa6f46dc5348d22e5ba160cba06b417be583e9fdef82734c3"} Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.232431 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podUID="c886af64-f9cc-4127-9d17-3007ae492d06" Sep 30 17:17:24 
crc kubenswrapper[4772]: I0930 17:17:24.235644 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" event={"ID":"27e94b49-6017-4790-af32-61cdb6c41f2c","Type":"ContainerStarted","Data":"9c46ade58684c6e615810690a7a2e08efa9febbb9c191b544e0df4a4a0e9e380"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.239676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" event={"ID":"b7ba1160-070d-4cc4-9c53-75817bd6141e","Type":"ContainerStarted","Data":"5001762d599db400d67a307d4cc5f1faafe37bab6fb9044eb7fd5ec55f78ea52"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.249361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" event={"ID":"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc","Type":"ContainerStarted","Data":"2d231776e84b97297ba2f520838bd11b9bdbe544b8d69bdfe5531c1ab9439e93"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.282581 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" event={"ID":"6c9f85e1-5df7-4943-9064-69af6e200e82","Type":"ContainerStarted","Data":"96a5d5fb1afb0396e253899f2bc7e1f57d2c3a26f9c2d0cc259c42426ee4aa9c"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.282664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" event={"ID":"6c9f85e1-5df7-4943-9064-69af6e200e82","Type":"ContainerStarted","Data":"1c8394f3d7daf37ba732ae66b31a55a11640ca72555ab3fa6bfce8b074f4fdd8"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.328639 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" event={"ID":"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36","Type":"ContainerStarted","Data":"32716f02c3ce2e1584ea46d474a3964bbb4d64c54386875d80414cbae0243de9"} Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.328831 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podUID="6c9f85e1-5df7-4943-9064-69af6e200e82" Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.357391 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" event={"ID":"4fcd6b42-8644-41f5-bd3b-51184d34cd00","Type":"ContainerStarted","Data":"5d000b1af31a5a458e01a3117b325177290f36e7300cefb380c1e25c469ffff0"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.397369 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" event={"ID":"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30","Type":"ContainerStarted","Data":"d4f33c1f3839f06fc027dc3ca43fa24d19bc84fa66226610ca1b74141cb85a85"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.397419 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" 
event={"ID":"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30","Type":"ContainerStarted","Data":"e402e7780cafc673cb614d240578bf8cba3d2be54f0363fe4323615a6996383f"} Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.410163 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" podUID="5b10f12b-b24a-4cf6-b07b-7b3e811ccd30" Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.438303 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" event={"ID":"d4295a68-a2dc-4b0b-a577-bbd6448d3a70","Type":"ContainerStarted","Data":"bca45777133cb28799d70e7019e3d59b81ea45971916ea9dfb11ca053658acda"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.471293 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" event={"ID":"f3a0e5a3-c50e-48ce-801d-f7916210165b","Type":"ContainerStarted","Data":"61d6a8e5bfe6d1e9fe9eb15b751c6bf3563157ec2c9d6dd82eb3187a07cd22eb"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.484211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" event={"ID":"058eec37-9f59-4fc5-8fa3-c9595bf58300","Type":"ContainerStarted","Data":"24382e8783bc3dcebcf7995b770f1854d6778e8dd00bdc9dc593fff6c7157404"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.501382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" event={"ID":"d10d7495-42f5-4919-8985-99913d62ab28","Type":"ContainerStarted","Data":"e266fe3870d0f5c3d333ab5dfeeb40384f93ff3328fb3bd90436d15f6196d0c9"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.501441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" event={"ID":"d10d7495-42f5-4919-8985-99913d62ab28","Type":"ContainerStarted","Data":"4d765c67b7548fd9094cf53793481ff66aef2c37fd9f744adaf665f7b8817f0a"} Sep 30 17:17:24 crc kubenswrapper[4772]: E0930 17:17:24.518684 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podUID="d10d7495-42f5-4919-8985-99913d62ab28" Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.528285 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" event={"ID":"d5be7b91-f881-4cd5-878e-1d40a94a3a8d","Type":"ContainerStarted","Data":"e93398eff424271eef92091935f36a2ee69ab587fc13aa3300c53002cdb418c8"} Sep 30 17:17:24 crc kubenswrapper[4772]: I0930 17:17:24.528338 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" event={"ID":"d5be7b91-f881-4cd5-878e-1d40a94a3a8d","Type":"ContainerStarted","Data":"e61172b1a50ae9cc913605b12ce634c2771b4a98fd1cbca74a95f7121663ef19"} Sep 30 17:17:24 
crc kubenswrapper[4772]: I0930 17:17:24.547279 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" event={"ID":"f8af3992-c401-4dea-b5a5-92063a05384e","Type":"ContainerStarted","Data":"ce00ebae79b44361735804ae15d01eea6303ce02effebedb3ed8eb0e796817a5"} Sep 30 17:17:25 crc kubenswrapper[4772]: I0930 17:17:25.564175 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" event={"ID":"d5be7b91-f881-4cd5-878e-1d40a94a3a8d","Type":"ContainerStarted","Data":"991737fb3b837c3ccef08b22dde96244ae2fbb5a3fa76298a8fa044cc69a162d"} Sep 30 17:17:25 crc kubenswrapper[4772]: I0930 17:17:25.565598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:25 crc kubenswrapper[4772]: E0930 17:17:25.568928 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podUID="6c9f85e1-5df7-4943-9064-69af6e200e82" Sep 30 17:17:25 crc kubenswrapper[4772]: E0930 17:17:25.568933 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podUID="d10d7495-42f5-4919-8985-99913d62ab28" Sep 30 17:17:25 crc kubenswrapper[4772]: E0930 17:17:25.575227 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podUID="c886af64-f9cc-4127-9d17-3007ae492d06" Sep 30 17:17:25 crc kubenswrapper[4772]: E0930 17:17:25.575866 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" podUID="5b10f12b-b24a-4cf6-b07b-7b3e811ccd30" Sep 30 17:17:25 crc kubenswrapper[4772]: E0930 17:17:25.582330 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" podUID="69e18d49-1290-4440-a3c9-885352fa18c5" Sep 30 17:17:25 crc kubenswrapper[4772]: I0930 17:17:25.671001 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" podStartSLOduration=4.67097179 podStartE2EDuration="4.67097179s" 
podCreationTimestamp="2025-09-30 17:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:17:25.666402742 +0000 UTC m=+946.573415573" watchObservedRunningTime="2025-09-30 17:17:25.67097179 +0000 UTC m=+946.577984631" Sep 30 17:17:32 crc kubenswrapper[4772]: I0930 17:17:32.562454 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5dd9b5767f-p4n9f" Sep 30 17:17:37 crc kubenswrapper[4772]: E0930 17:17:37.538670 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b" Sep 30 17:17:37 crc kubenswrapper[4772]: E0930 17:17:37.542089 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2h6mx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-swgvc_openstack-operators(1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:17:37 crc kubenswrapper[4772]: E0930 17:17:37.543399 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" podUID="1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f" Sep 30 17:17:37 crc kubenswrapper[4772]: E0930 17:17:37.712252 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" podUID="1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f" Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.655898 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.655984 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.720148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" event={"ID":"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36","Type":"ContainerStarted","Data":"d93d1da71ec777ed500aa6a2b93a60577b576f27dc567f80c0ad28d77f3b92dc"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.726944 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" event={"ID":"27e94b49-6017-4790-af32-61cdb6c41f2c","Type":"ContainerStarted","Data":"d593ddf809fcda27fd081a510765a9aaceae1e33b62be5539756ea7f27aa3529"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.739588 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" event={"ID":"832335d3-7446-4879-8ec1-8f24d6d3708a","Type":"ContainerStarted","Data":"d5b4a86ba04f5fa7a26f09ac7031d829d6b24e6073afbec41bed09173657780e"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.756478 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" event={"ID":"ad2965ed-ed78-4646-97ae-07cce49e8eb1","Type":"ContainerStarted","Data":"354b370b2a4641497e543036008a648bfebf7ff1f89af63a0a7ed62facd69db4"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.765545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" event={"ID":"1753608a-67af-4fa4-83f1-3f7d1623fc6b","Type":"ContainerStarted","Data":"5c6619641b0e32420d5dd1847f224dfad1bc9b04c798c030039a39ba41a577d4"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.769233 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" event={"ID":"f8af3992-c401-4dea-b5a5-92063a05384e","Type":"ContainerStarted","Data":"373d7982bd4e9a57884991142952af32df1f2af3d65e0ea6d9580cf65578a438"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.772587 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" event={"ID":"314c8eb1-ee8d-405d-9bb6-a74de21c2f01","Type":"ContainerStarted","Data":"a3d3cef588af40b2460584974d9b8f577c30825b004122d3e75f502e1a31eb4b"} Sep 30 17:17:38 crc 
kubenswrapper[4772]: I0930 17:17:38.803154 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" event={"ID":"b7ba1160-070d-4cc4-9c53-75817bd6141e","Type":"ContainerStarted","Data":"bf03ac0b5508c25a724e12b5e87b26e7bf82e92933f11e85055606f564505306"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.814159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" event={"ID":"058eec37-9f59-4fc5-8fa3-c9595bf58300","Type":"ContainerStarted","Data":"aa7bd6fde579390d6739416d054631ef067accd48e41cda16153a0dead398e9d"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.837172 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" event={"ID":"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9","Type":"ContainerStarted","Data":"d68ceaad99da25aa2f554229097b3ac7379707abd0d917315845bb4d6a568cb6"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.867145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" event={"ID":"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff","Type":"ContainerStarted","Data":"3e0b6fd7dc9917f9f6fc36e1e8486f3e36140129099b3aa7a8e204218943081c"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.903979 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" event={"ID":"d4295a68-a2dc-4b0b-a577-bbd6448d3a70","Type":"ContainerStarted","Data":"322dddd22a71127d68a18e740f139122a7f9bcb088fada4cf2efb1991e60cfcf"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.918407 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" event={"ID":"80d5010e-a767-491b-bcb2-89272762a121","Type":"ContainerStarted","Data":"4af6f84baa6dc3c2992914e46c5ca7c97840133d4762babf54f944e61d1ea035"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.928711 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" event={"ID":"f3a0e5a3-c50e-48ce-801d-f7916210165b","Type":"ContainerStarted","Data":"43993994c099d5a0c81098d532effdf02210aebddddbf4af19ad98c461eeeb71"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.939444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" event={"ID":"4fcd6b42-8644-41f5-bd3b-51184d34cd00","Type":"ContainerStarted","Data":"1da4af7a411d7af9a3cbb17cebf52b6009be10590100ebc7db3c97127b2a16c9"} Sep 30 17:17:38 crc kubenswrapper[4772]: I0930 17:17:38.973595 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" event={"ID":"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc","Type":"ContainerStarted","Data":"095af1591734198731dec3fa9372c212163c48d643823b2306b3b6609daf474b"} Sep 30 17:17:39 crc kubenswrapper[4772]: I0930 17:17:39.998116 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" event={"ID":"058eec37-9f59-4fc5-8fa3-c9595bf58300","Type":"ContainerStarted","Data":"38608dd8e565138be573ee5b6de4c3f7d818f6898f6c0909f324a7b6bcad2773"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.012951 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" event={"ID":"832335d3-7446-4879-8ec1-8f24d6d3708a","Type":"ContainerStarted","Data":"cc2b2b728b8413b6a021ef6e7e703777bff4288f4ece48d580b0f4649dc628dc"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.015583 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.038376 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" event={"ID":"df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff","Type":"ContainerStarted","Data":"0bd8ff26663b864c5386b3a05e957e34fdbcc2c523c465a307b9c77a11a2ba18"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.038451 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.051363 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" podStartSLOduration=4.989556812 podStartE2EDuration="20.05133698s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:22.440345379 +0000 UTC m=+943.347358210" lastFinishedPulling="2025-09-30 17:17:37.502125547 +0000 UTC m=+958.409138378" observedRunningTime="2025-09-30 17:17:40.048819575 +0000 UTC m=+960.955832406" watchObservedRunningTime="2025-09-30 17:17:40.05133698 +0000 UTC m=+960.958349811" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.075855 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" event={"ID":"f8af3992-c401-4dea-b5a5-92063a05384e","Type":"ContainerStarted","Data":"f78ab1224392b8e565a98abd1baaeb58768b06e5b9f03d8e29210e99b98f5c6c"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.076139 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.078158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" event={"ID":"314c8eb1-ee8d-405d-9bb6-a74de21c2f01","Type":"ContainerStarted","Data":"fa50f065e9d1380a681b61d1afa36dbfcb6afd93f238a81c7d4c77a095df1ad9"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.078349 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.089676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" event={"ID":"44aed112-2ebc-48b6-b3b4-9a47d2dafaa9","Type":"ContainerStarted","Data":"5f9b19ff4b2edcdec1e3532e847b1299c996ff26dbcfab734e655d43ddb28716"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.090425 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.094531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" event={"ID":"4fcd6b42-8644-41f5-bd3b-51184d34cd00","Type":"ContainerStarted","Data":"84151bbc226af333eb613f2e50f2cefb57760a3e65449c336ae744b58d0f797f"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.095362 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.100531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" event={"ID":"d4295a68-a2dc-4b0b-a577-bbd6448d3a70","Type":"ContainerStarted","Data":"88078d389bf59c31837c94b532ca2590a17a027877de1116343368ac8edd7563"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.101167 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.110208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" event={"ID":"f3a0e5a3-c50e-48ce-801d-f7916210165b","Type":"ContainerStarted","Data":"9077ac7a04808782fe0a0d3cb73790cc9f93ce5d702059252c2fc5c4d78f3cd7"} Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.110579 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.110907 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" podStartSLOduration=5.957621207 podStartE2EDuration="20.110888379s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.345332344 +0000 UTC m=+944.252345175" lastFinishedPulling="2025-09-30 17:17:37.498599516 +0000 UTC m=+958.405612347" observedRunningTime="2025-09-30 17:17:40.075262718 +0000 UTC m=+960.982275549" watchObservedRunningTime="2025-09-30 17:17:40.110888379 +0000 UTC m=+961.017901200" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.111792 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" podStartSLOduration=6.151841336 podStartE2EDuration="20.111786582s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.539158243 +0000 UTC m=+944.446171084" lastFinishedPulling="2025-09-30 17:17:37.499103499 +0000 UTC m=+958.406116330" observedRunningTime="2025-09-30 17:17:40.107946113 +0000 UTC m=+961.014958944" watchObservedRunningTime="2025-09-30 17:17:40.111786582 +0000 UTC m=+961.018799413" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.153858 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" podStartSLOduration=6.02276767 podStartE2EDuration="20.153825948s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.380499753 +0000 UTC m=+944.287512584" lastFinishedPulling="2025-09-30 17:17:37.511558031 +0000 UTC m=+958.418570862" observedRunningTime="2025-09-30 17:17:40.138944884 +0000 UTC m=+961.045957725" watchObservedRunningTime="2025-09-30 17:17:40.153825948 +0000 UTC m=+961.060838779" Sep 30 17:17:40 crc 
kubenswrapper[4772]: I0930 17:17:40.172659 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" podStartSLOduration=5.129692802 podStartE2EDuration="20.172631694s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:22.469254266 +0000 UTC m=+943.376267107" lastFinishedPulling="2025-09-30 17:17:37.512193168 +0000 UTC m=+958.419205999" observedRunningTime="2025-09-30 17:17:40.171445084 +0000 UTC m=+961.078457935" watchObservedRunningTime="2025-09-30 17:17:40.172631694 +0000 UTC m=+961.079644525" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.246987 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" podStartSLOduration=5.565403882 podStartE2EDuration="20.246956115s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:22.820590845 +0000 UTC m=+943.727603676" lastFinishedPulling="2025-09-30 17:17:37.502143068 +0000 UTC m=+958.409155909" observedRunningTime="2025-09-30 17:17:40.209753754 +0000 UTC m=+961.116766585" watchObservedRunningTime="2025-09-30 17:17:40.246956115 +0000 UTC m=+961.153968946" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.272193 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" podStartSLOduration=5.150132011 podStartE2EDuration="19.272159216s" podCreationTimestamp="2025-09-30 17:17:21 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.380164194 +0000 UTC m=+944.287177025" lastFinishedPulling="2025-09-30 17:17:37.502191399 +0000 UTC m=+958.409204230" observedRunningTime="2025-09-30 17:17:40.242389457 +0000 UTC m=+961.149402278" watchObservedRunningTime="2025-09-30 17:17:40.272159216 +0000 UTC m=+961.179172047" Sep 30 17:17:40 crc kubenswrapper[4772]: I0930 17:17:40.276534 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" podStartSLOduration=6.107273424 podStartE2EDuration="20.276519519s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.331280111 +0000 UTC m=+944.238292942" lastFinishedPulling="2025-09-30 17:17:37.500526206 +0000 UTC m=+958.407539037" observedRunningTime="2025-09-30 17:17:40.273163312 +0000 UTC m=+961.180176153" watchObservedRunningTime="2025-09-30 17:17:40.276519519 +0000 UTC m=+961.183532350" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.131009 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" event={"ID":"51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc","Type":"ContainerStarted","Data":"bc951dcb36f8766452a1f2c5dbb6cb9b655f1bb3444c27b3c6433779f51120bf"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.131601 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.133662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" event={"ID":"ad2965ed-ed78-4646-97ae-07cce49e8eb1","Type":"ContainerStarted","Data":"df29981c8142ff3d230ec0ba3a09c523f9d176df001deef9b75e1297767d0de2"} Sep 30 17:17:41 crc 
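The "SyncLoop (probe)" readiness entries interleaved above correspond to the probe blocks embedded in the earlier &Container{...} dumps: HTTPGet /readyz on port 8081, 5s initial delay, 10s period, 1s timeout, failure threshold 3. Rendered back from the struct dump into source form, as a sketch with the upstream corev1 types rather than any operator's actual manifest:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// ReadinessProbe as printed in the container dumps; the odd-looking
	// Port:{0 8081 } in the log is just intstr notation for an int port.
	readiness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/readyz",
				Port:   intstr.FromInt(8081),
				Scheme: corev1.URISchemeHTTP,
			},
		},
		InitialDelaySeconds: 5,
		PeriodSeconds:       10,
		TimeoutSeconds:      1,
		SuccessThreshold:    1,
		FailureThreshold:    3,
	}
	fmt.Printf("%+v\n", readiness)
}

An empty status in a probe entry means the probe ran while the container was still unready; the later status="ready" line (as for the openstack-operator pod at 17:17:32) marks the transition that lets the readiness gate flip.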
kubenswrapper[4772]: I0930 17:17:41.134610 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.137655 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" event={"ID":"1753608a-67af-4fa4-83f1-3f7d1623fc6b","Type":"ContainerStarted","Data":"5318821416a0e208dab9c1bd8193beb62b06f2a21223056b9c7ecefa3e51e1ce"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.137743 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.143391 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" event={"ID":"13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36","Type":"ContainerStarted","Data":"8c2360da0a590272aaa7f12ea15f38423c540bc0d778177f94f6c48db7f45cc0"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.143489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.145731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" event={"ID":"80d5010e-a767-491b-bcb2-89272762a121","Type":"ContainerStarted","Data":"daca416d8c6a169d012d3649454a3748f2618a7e0c500db4100b83adfba9e6c1"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.145880 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.152296 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" event={"ID":"b7ba1160-070d-4cc4-9c53-75817bd6141e","Type":"ContainerStarted","Data":"482005dcaa1ac764e961af8838f0333caf26bb6722f90099200162e0e05f40c4"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.152831 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.164126 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" event={"ID":"27e94b49-6017-4790-af32-61cdb6c41f2c","Type":"ContainerStarted","Data":"7adcb738c5e47016a9329c931398b2aa1ae09febff617be84ac115323b1ad29b"} Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.164578 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.164619 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.180530 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" podStartSLOduration=7.051134252 podStartE2EDuration="21.180508657s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 
17:17:23.380820831 +0000 UTC m=+944.287833662" lastFinishedPulling="2025-09-30 17:17:37.510195216 +0000 UTC m=+958.417208067" observedRunningTime="2025-09-30 17:17:41.17828344 +0000 UTC m=+962.085296271" watchObservedRunningTime="2025-09-30 17:17:41.180508657 +0000 UTC m=+962.087521488" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.185005 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" podStartSLOduration=7.043475324 podStartE2EDuration="21.184990133s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.371638844 +0000 UTC m=+944.278651675" lastFinishedPulling="2025-09-30 17:17:37.513153633 +0000 UTC m=+958.420166484" observedRunningTime="2025-09-30 17:17:41.15818079 +0000 UTC m=+962.065193641" watchObservedRunningTime="2025-09-30 17:17:41.184990133 +0000 UTC m=+962.092002964" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.206719 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" podStartSLOduration=7.01853026 podStartE2EDuration="21.206692704s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.324432424 +0000 UTC m=+944.231445255" lastFinishedPulling="2025-09-30 17:17:37.512594848 +0000 UTC m=+958.419607699" observedRunningTime="2025-09-30 17:17:41.199094298 +0000 UTC m=+962.106107129" watchObservedRunningTime="2025-09-30 17:17:41.206692704 +0000 UTC m=+962.113705535" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.225039 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" podStartSLOduration=5.6252986499999995 podStartE2EDuration="21.225019128s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:21.913187968 +0000 UTC m=+942.820200799" lastFinishedPulling="2025-09-30 17:17:37.512908446 +0000 UTC m=+958.419921277" observedRunningTime="2025-09-30 17:17:41.21852434 +0000 UTC m=+962.125537171" watchObservedRunningTime="2025-09-30 17:17:41.225019128 +0000 UTC m=+962.132031959" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.240846 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" podStartSLOduration=7.100792466 podStartE2EDuration="21.240826646s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.370438603 +0000 UTC m=+944.277451424" lastFinishedPulling="2025-09-30 17:17:37.510472773 +0000 UTC m=+958.417485604" observedRunningTime="2025-09-30 17:17:41.235533039 +0000 UTC m=+962.142545890" watchObservedRunningTime="2025-09-30 17:17:41.240826646 +0000 UTC m=+962.147839477" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.260228 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" podStartSLOduration=5.772944105 podStartE2EDuration="21.260203917s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:22.025673025 +0000 UTC m=+942.932685856" lastFinishedPulling="2025-09-30 17:17:37.512932837 +0000 UTC m=+958.419945668" observedRunningTime="2025-09-30 17:17:41.250031214 +0000 UTC m=+962.157044055" watchObservedRunningTime="2025-09-30 17:17:41.260203917 +0000 UTC 
m=+962.167216748" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.270170 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" podStartSLOduration=7.097440529 podStartE2EDuration="21.270146944s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.336347082 +0000 UTC m=+944.243359913" lastFinishedPulling="2025-09-30 17:17:37.509053497 +0000 UTC m=+958.416066328" observedRunningTime="2025-09-30 17:17:41.265940475 +0000 UTC m=+962.172953316" watchObservedRunningTime="2025-09-30 17:17:41.270146944 +0000 UTC m=+962.177159775" Sep 30 17:17:41 crc kubenswrapper[4772]: I0930 17:17:41.284564 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" podStartSLOduration=7.107407927 podStartE2EDuration="21.284544396s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.323623453 +0000 UTC m=+944.230636284" lastFinishedPulling="2025-09-30 17:17:37.500759912 +0000 UTC m=+958.407772753" observedRunningTime="2025-09-30 17:17:41.282198125 +0000 UTC m=+962.189210966" watchObservedRunningTime="2025-09-30 17:17:41.284544396 +0000 UTC m=+962.191557227" Sep 30 17:17:42 crc kubenswrapper[4772]: I0930 17:17:42.176741 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-b6np7" Sep 30 17:17:43 crc kubenswrapper[4772]: I0930 17:17:43.181192 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-b82x8" Sep 30 17:17:43 crc kubenswrapper[4772]: I0930 17:17:43.181627 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-dn2kt" Sep 30 17:17:43 crc kubenswrapper[4772]: I0930 17:17:43.182137 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-n7w7p" Sep 30 17:17:50 crc kubenswrapper[4772]: E0930 17:17:50.564645 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884" Sep 30 17:17:50 crc kubenswrapper[4772]: E0930 17:17:50.565102 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nvzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6d68dbc695-2fx6p_openstack-operators(c886af64-f9cc-4127-9d17-3007ae492d06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:17:50 crc kubenswrapper[4772]: E0930 17:17:50.566270 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podUID="c886af64-f9cc-4127-9d17-3007ae492d06" Sep 30 17:17:50 crc kubenswrapper[4772]: I0930 17:17:50.882218 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-tzz8m" Sep 30 17:17:50 crc kubenswrapper[4772]: I0930 17:17:50.919818 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-rqv96" Sep 30 17:17:50 crc kubenswrapper[4772]: I0930 17:17:50.951091 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-ghtj2" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.066208 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.066482 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvp4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-8xbv5_openstack-operators(6c9f85e1-5df7-4943-9064-69af6e200e82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.067693 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podUID="6c9f85e1-5df7-4943-9064-69af6e200e82" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.095165 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-56vtc" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.328085 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vlpqf" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.373778 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-smllw" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.379021 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-z472f" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.404357 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-kpr6v" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.550487 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mfnlh" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.704589 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.704781 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hc8g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-7975b88857-vdhkv_openstack-operators(d10d7495-42f5-4919-8985-99913d62ab28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:17:51 crc kubenswrapper[4772]: E0930 17:17:51.706333 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podUID="d10d7495-42f5-4919-8985-99913d62ab28" Sep 30 17:17:51 
crc kubenswrapper[4772]: I0930 17:17:51.727217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-shhhk" Sep 30 17:17:51 crc kubenswrapper[4772]: I0930 17:17:51.878260 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-6m9mb" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.025801 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-86c75f6bd4-4fnzg" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.253156 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" event={"ID":"5b10f12b-b24a-4cf6-b07b-7b3e811ccd30","Type":"ContainerStarted","Data":"66633ee76f892c93675464971a38df535474222d6767a5ec080a565ea7a1576a"} Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.253442 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.255982 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" event={"ID":"1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f","Type":"ContainerStarted","Data":"f24bb48af58684879affadad76d48e8ebb60071d13aab8a9266846676994b8e2"} Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.264558 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" event={"ID":"69e18d49-1290-4440-a3c9-885352fa18c5","Type":"ContainerStarted","Data":"44ce648e50201ae1fdc50aadb9d3ece835e7e5b32207a3a4b958b71fb73b4c86"} Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.264980 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.274498 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" podStartSLOduration=3.235764154 podStartE2EDuration="31.274475648s" podCreationTimestamp="2025-09-30 17:17:21 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.670131097 +0000 UTC m=+944.577143928" lastFinishedPulling="2025-09-30 17:17:51.708842591 +0000 UTC m=+972.615855422" observedRunningTime="2025-09-30 17:17:52.270311731 +0000 UTC m=+973.177324562" watchObservedRunningTime="2025-09-30 17:17:52.274475648 +0000 UTC m=+973.181488479" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.315475 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-swgvc" podStartSLOduration=3.25817545 podStartE2EDuration="31.315450253s" podCreationTimestamp="2025-09-30 17:17:21 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.727254433 +0000 UTC m=+944.634267264" lastFinishedPulling="2025-09-30 17:17:51.784529246 +0000 UTC m=+972.691542067" observedRunningTime="2025-09-30 17:17:52.312352643 +0000 UTC m=+973.219365474" watchObservedRunningTime="2025-09-30 17:17:52.315450253 +0000 UTC m=+973.222463074" Sep 30 17:17:52 crc kubenswrapper[4772]: I0930 17:17:52.318202 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" podStartSLOduration=4.412107357 podStartE2EDuration="32.318189084s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.807321872 +0000 UTC m=+944.714334703" lastFinishedPulling="2025-09-30 17:17:51.713403599 +0000 UTC m=+972.620416430" observedRunningTime="2025-09-30 17:17:52.298166198 +0000 UTC m=+973.205179039" watchObservedRunningTime="2025-09-30 17:17:52.318189084 +0000 UTC m=+973.225201915" Sep 30 17:18:01 crc kubenswrapper[4772]: E0930 17:18:01.901684 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podUID="6c9f85e1-5df7-4943-9064-69af6e200e82" Sep 30 17:18:01 crc kubenswrapper[4772]: E0930 17:18:01.901894 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podUID="c886af64-f9cc-4127-9d17-3007ae492d06" Sep 30 17:18:01 crc kubenswrapper[4772]: I0930 17:18:01.921499 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-xmwpp" Sep 30 17:18:03 crc kubenswrapper[4772]: I0930 17:18:03.035961 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-dxbpz" Sep 30 17:18:05 crc kubenswrapper[4772]: E0930 17:18:05.900550 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e6f1ed6b386f77415c2a44e770d98ab6d16b6f6b494c4d1b4ac4b46368c4a4e6\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podUID="d10d7495-42f5-4919-8985-99913d62ab28" Sep 30 17:18:08 crc kubenswrapper[4772]: I0930 17:18:08.656952 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:18:08 crc kubenswrapper[4772]: I0930 17:18:08.658109 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:18:08 crc kubenswrapper[4772]: I0930 17:18:08.658267 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:18:08 crc kubenswrapper[4772]: I0930 17:18:08.659146 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:18:08 crc kubenswrapper[4772]: I0930 17:18:08.659321 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161" gracePeriod=600 Sep 30 17:18:09 crc kubenswrapper[4772]: I0930 17:18:09.381610 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161" exitCode=0 Sep 30 17:18:09 crc kubenswrapper[4772]: I0930 17:18:09.381657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161"} Sep 30 17:18:09 crc kubenswrapper[4772]: I0930 17:18:09.381959 4772 scope.go:117] "RemoveContainer" containerID="9c4130d132bd9ba1e58ca9105011cc1089aeabb461da2027bde96f24d0137622" Sep 30 17:18:11 crc kubenswrapper[4772]: I0930 17:18:11.400441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9"} Sep 30 17:18:14 crc kubenswrapper[4772]: I0930 17:18:14.423669 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" event={"ID":"c886af64-f9cc-4127-9d17-3007ae492d06","Type":"ContainerStarted","Data":"2c5ec57301647709a97f6391e08e280ffa087e0a9cf4931140b34542c0c21ad6"} Sep 30 17:18:14 crc kubenswrapper[4772]: I0930 17:18:14.425626 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:18:14 crc kubenswrapper[4772]: I0930 17:18:14.426885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" event={"ID":"6c9f85e1-5df7-4943-9064-69af6e200e82","Type":"ContainerStarted","Data":"0e20181557f7822cf410570b6cf3447cdd314bc9f241a0b3bcfaad034a00ec64"} Sep 30 17:18:14 crc kubenswrapper[4772]: I0930 17:18:14.427089 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:18:14 crc kubenswrapper[4772]: I0930 17:18:14.444992 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" podStartSLOduration=4.445959038 podStartE2EDuration="54.444971913s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.546806241 +0000 UTC m=+944.453819072" lastFinishedPulling="2025-09-30 17:18:13.545819076 +0000 UTC m=+994.452831947" observedRunningTime="2025-09-30 17:18:14.439641256 +0000 UTC m=+995.346654087" watchObservedRunningTime="2025-09-30 17:18:14.444971913 +0000 UTC m=+995.351984744" Sep 30 17:18:14 crc 
kubenswrapper[4772]: I0930 17:18:14.458256 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" podStartSLOduration=4.568110952 podStartE2EDuration="54.458233674s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.654279668 +0000 UTC m=+944.561292499" lastFinishedPulling="2025-09-30 17:18:13.54440239 +0000 UTC m=+994.451415221" observedRunningTime="2025-09-30 17:18:14.457545967 +0000 UTC m=+995.364558808" watchObservedRunningTime="2025-09-30 17:18:14.458233674 +0000 UTC m=+995.365246505" Sep 30 17:18:20 crc kubenswrapper[4772]: I0930 17:18:20.465180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" event={"ID":"d10d7495-42f5-4919-8985-99913d62ab28","Type":"ContainerStarted","Data":"3722e051caa8edbf1a6afeafe8c0e451351f1df6f62e8e0573e1be3c426d4fdd"} Sep 30 17:18:20 crc kubenswrapper[4772]: I0930 17:18:20.466342 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:18:20 crc kubenswrapper[4772]: I0930 17:18:20.484192 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" podStartSLOduration=4.279155111 podStartE2EDuration="1m0.484176007s" podCreationTimestamp="2025-09-30 17:17:20 +0000 UTC" firstStartedPulling="2025-09-30 17:17:23.52317173 +0000 UTC m=+944.430184561" lastFinishedPulling="2025-09-30 17:18:19.728192626 +0000 UTC m=+1000.635205457" observedRunningTime="2025-09-30 17:18:20.481160049 +0000 UTC m=+1001.388172880" watchObservedRunningTime="2025-09-30 17:18:20.484176007 +0000 UTC m=+1001.391188838" Sep 30 17:18:21 crc kubenswrapper[4772]: I0930 17:18:21.425894 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-8xbv5" Sep 30 17:18:21 crc kubenswrapper[4772]: I0930 17:18:21.438857 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2fx6p" Sep 30 17:18:31 crc kubenswrapper[4772]: I0930 17:18:31.362708 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-vdhkv" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.404224 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.406042 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.410081 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.410334 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.410816 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.410946 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nrvzs" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.417960 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.479796 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.490600 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.490708 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.493724 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.554768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.554839 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.554891 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.554923 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jx69\" (UniqueName: \"kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.555005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvm7\" (UniqueName: \"kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " 
pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.656733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.656809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.656852 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.656882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jx69\" (UniqueName: \"kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.656912 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvm7\" (UniqueName: \"kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.657746 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.657794 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.658246 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.674869 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvm7\" (UniqueName: \"kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7\") pod \"dnsmasq-dns-6b4d96f67c-ffhdn\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.675559 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jx69\" (UniqueName: \"kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69\") pod \"dnsmasq-dns-7978bb4dbf-4fnjk\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.730400 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:18:51 crc kubenswrapper[4772]: I0930 17:18:51.805495 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:18:52 crc kubenswrapper[4772]: I0930 17:18:52.137186 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:18:52 crc kubenswrapper[4772]: I0930 17:18:52.254787 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:18:52 crc kubenswrapper[4772]: W0930 17:18:52.262218 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6bb9cf8_6565_4414_b8db_da6be55dda45.slice/crio-f81cfc5c0cc9bf318b4a96fdb68fc31b0341949edac1455ecb86af5229013e4b WatchSource:0}: Error finding container f81cfc5c0cc9bf318b4a96fdb68fc31b0341949edac1455ecb86af5229013e4b: Status 404 returned error can't find the container with id f81cfc5c0cc9bf318b4a96fdb68fc31b0341949edac1455ecb86af5229013e4b Sep 30 17:18:52 crc kubenswrapper[4772]: I0930 17:18:52.721403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" event={"ID":"f6bb9cf8-6565-4414-b8db-da6be55dda45","Type":"ContainerStarted","Data":"f81cfc5c0cc9bf318b4a96fdb68fc31b0341949edac1455ecb86af5229013e4b"} Sep 30 17:18:52 crc kubenswrapper[4772]: I0930 17:18:52.722406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" event={"ID":"11e271ca-2800-4cd9-872c-da0a0bab2298","Type":"ContainerStarted","Data":"cf3f4e3d5358f08b820a4bb1bad689bbfc10d2774020f1d16764aff8527d6692"} Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.669507 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.693407 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.699647 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.723948 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.823743 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.823851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqkf\" (UniqueName: \"kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.823876 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.944645 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.945028 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqkf\" (UniqueName: \"kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.945082 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.946044 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.946655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:55 crc kubenswrapper[4772]: I0930 17:18:55.998481 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.000791 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqkf\" (UniqueName: \"kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf\") pod \"dnsmasq-dns-9c554dd7c-kbhsn\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.030661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.037744 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.041127 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.047967 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.159867 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hthx\" (UniqueName: \"kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.159959 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.160104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.261609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.262104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.262162 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hthx\" (UniqueName: \"kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.262839 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.263154 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.313626 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hthx\" (UniqueName: \"kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx\") pod \"dnsmasq-dns-79d46b9689-d9snl\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.356328 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.356876 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.390016 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.391336 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.399305 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.465757 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.465906 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.466003 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.571470 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.571549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.571606 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.572630 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.573398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.627007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc\") pod \"dnsmasq-dns-77fb6b9747-wl7s4\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.736850 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.781767 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:18:56 crc kubenswrapper[4772]: W0930 17:18:56.806496 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb107f703_007f_41fa_8b85_fabdaa5da089.slice/crio-8b5cbc5686a9b686fdede061ab39dbd826f67800831a2f42dd0df6fd4f47b28b WatchSource:0}: Error finding container 8b5cbc5686a9b686fdede061ab39dbd826f67800831a2f42dd0df6fd4f47b28b: Status 404 returned error can't find the container with id 8b5cbc5686a9b686fdede061ab39dbd826f67800831a2f42dd0df6fd4f47b28b Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.863037 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.864510 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.867037 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.867558 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.867776 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-sp2lt" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.867922 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.868071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.868240 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.868377 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.880670 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.983759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdc29\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-kube-api-access-sdc29\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.983836 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.983885 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984343 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/607217cf-8f90-4adb-bca7-0271ea8a7b9b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984375 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " 
pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984426 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984842 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/607217cf-8f90-4adb-bca7-0271ea8a7b9b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.984974 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:56 crc kubenswrapper[4772]: I0930 17:18:56.985234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/607217cf-8f90-4adb-bca7-0271ea8a7b9b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086618 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086648 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdc29\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-kube-api-access-sdc29\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086812 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/607217cf-8f90-4adb-bca7-0271ea8a7b9b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086860 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.086950 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.087374 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.087928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.088007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.088724 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.088794 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.089150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607217cf-8f90-4adb-bca7-0271ea8a7b9b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.093022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/607217cf-8f90-4adb-bca7-0271ea8a7b9b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.093159 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.107326 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.122346 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 
17:18:57.127893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdc29\" (UniqueName: \"kubernetes.io/projected/607217cf-8f90-4adb-bca7-0271ea8a7b9b-kube-api-access-sdc29\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.136690 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.153438 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/607217cf-8f90-4adb-bca7-0271ea8a7b9b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"607217cf-8f90-4adb-bca7-0271ea8a7b9b\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.178514 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.196236 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.200549 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.205146 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.205459 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rpsdz" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.205620 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.205801 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.205929 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.206493 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.209576 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.224007 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291347 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291364 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291443 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.291975 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.292000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.292027 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.292080 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942vw\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.292104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.292436 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.336842 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.394139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.394364 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.394310 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.395485 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398243 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398339 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-942vw\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398359 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398416 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398500 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398515 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.398563 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.401299 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.401348 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.402135 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " 
pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.402505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.407554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.407759 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.415015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.419959 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.422329 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.422975 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942vw\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw\") pod \"rabbitmq-server-0\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.547531 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.585238 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.586961 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.589994 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.591977 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.592018 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.592518 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.592577 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.592658 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wlgst" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.592696 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.602578 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703110 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703183 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703210 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703307 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703332 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703407 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703687 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.703964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9tk5\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807502 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9tk5\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807533 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807559 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807594 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807623 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807648 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807692 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.807929 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.808143 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.808195 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.808216 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.808760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.808942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.816920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.818777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.819979 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.820219 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.829456 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9tk5\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.830325 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.837262 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" event={"ID":"1cec6569-cbbf-433a-ac10-c314faf1f80f","Type":"ContainerStarted","Data":"e505a04d65dc76982ea4c1cb2b290a296a25bcc049522025f1928b38218b074f"} Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.842130 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" event={"ID":"a1ccfbe9-8a91-4864-abbd-876484b84d92","Type":"ContainerStarted","Data":"9be6eff1e1cb7c6bbbae679ad683a4fa35b7570ddafdbbfbd7e94e6fee8bafdf"} Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.847918 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" event={"ID":"b107f703-007f-41fa-8b85-fabdaa5da089","Type":"ContainerStarted","Data":"8b5cbc5686a9b686fdede061ab39dbd826f67800831a2f42dd0df6fd4f47b28b"} Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.850501 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:57 crc kubenswrapper[4772]: I0930 17:18:57.924890 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.145503 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.317827 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:18:58 crc kubenswrapper[4772]: W0930 17:18:58.342494 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e90f254_e3e7_4c4f_acfe_1a251e7682df.slice/crio-b711f24cb10db43dd7ebc534e4b0d335b794329e1d4a97fa1f0d885d47c9a8d9 WatchSource:0}: Error finding container b711f24cb10db43dd7ebc534e4b0d335b794329e1d4a97fa1f0d885d47c9a8d9: Status 404 returned error can't find the container with id b711f24cb10db43dd7ebc534e4b0d335b794329e1d4a97fa1f0d885d47c9a8d9 Sep 30 17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.419803 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 17:18:58 crc kubenswrapper[4772]: W0930 17:18:58.427383 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607217cf_8f90_4adb_bca7_0271ea8a7b9b.slice/crio-1634aecaf60fabc3e3e5a223cfd1f3a4ed16209c6b69e1a0602ddadc24b7639f WatchSource:0}: Error finding container 1634aecaf60fabc3e3e5a223cfd1f3a4ed16209c6b69e1a0602ddadc24b7639f: Status 404 returned error can't find the container with id 1634aecaf60fabc3e3e5a223cfd1f3a4ed16209c6b69e1a0602ddadc24b7639f Sep 30 17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.860398 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerStarted","Data":"b711f24cb10db43dd7ebc534e4b0d335b794329e1d4a97fa1f0d885d47c9a8d9"} Sep 30 17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.862102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerStarted","Data":"d5d75b166cc3d246f3ca13757503f1c51a46835d787fa2f1696806213ba9a783"} Sep 30 
17:18:58 crc kubenswrapper[4772]: I0930 17:18:58.864153 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"607217cf-8f90-4adb-bca7-0271ea8a7b9b","Type":"ContainerStarted","Data":"1634aecaf60fabc3e3e5a223cfd1f3a4ed16209c6b69e1a0602ddadc24b7639f"} Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.248774 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.251930 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.256531 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.257133 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.257451 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.257741 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dq2v6" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.258350 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.273562 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.280744 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367212 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5548eec2-33be-42b2-9b84-572236f095db-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367270 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-secrets\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367293 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367361 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367385 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-kolla-config\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367434 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwdd\" (UniqueName: \"kubernetes.io/projected/5548eec2-33be-42b2-9b84-572236f095db-kube-api-access-tnwdd\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.367456 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-config-data-default\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.387820 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.397649 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.404073 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.405379 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.405496 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xqx6l" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.405508 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.405741 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwdd\" (UniqueName: \"kubernetes.io/projected/5548eec2-33be-42b2-9b84-572236f095db-kube-api-access-tnwdd\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468551 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468753 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-config-data-default\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5548eec2-33be-42b2-9b84-572236f095db-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468876 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b4b3176-3882-486d-8217-54f429906f49-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468908 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-secrets\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.468961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469043 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr8l\" (UniqueName: \"kubernetes.io/projected/4b4b3176-3882-486d-8217-54f429906f49-kube-api-access-wgr8l\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469167 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469192 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469243 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469266 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-kolla-config\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc 
kubenswrapper[4772]: I0930 17:19:00.469309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.469687 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-config-data-default\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.470264 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5548eec2-33be-42b2-9b84-572236f095db-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.470335 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.470781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.472957 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5548eec2-33be-42b2-9b84-572236f095db-kolla-config\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.475788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.477273 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.492935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwdd\" (UniqueName: 
\"kubernetes.io/projected/5548eec2-33be-42b2-9b84-572236f095db-kube-api-access-tnwdd\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.503576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.506484 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5548eec2-33be-42b2-9b84-572236f095db-secrets\") pod \"openstack-galera-0\" (UID: \"5548eec2-33be-42b2-9b84-572236f095db\") " pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571436 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571465 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571529 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571583 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b4b3176-3882-486d-8217-54f429906f49-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571618 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.571689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr8l\" (UniqueName: \"kubernetes.io/projected/4b4b3176-3882-486d-8217-54f429906f49-kube-api-access-wgr8l\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.572618 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.573964 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.578922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.579003 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.583134 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4b3176-3882-486d-8217-54f429906f49-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.590940 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.592844 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.599139 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.599349 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.599451 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-tgmxg" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.599886 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.607427 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.607475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b4b3176-3882-486d-8217-54f429906f49-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.611334 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4b3176-3882-486d-8217-54f429906f49-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.613341 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.644436 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr8l\" (UniqueName: \"kubernetes.io/projected/4b4b3176-3882-486d-8217-54f429906f49-kube-api-access-wgr8l\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.657889 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b4b3176-3882-486d-8217-54f429906f49\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.673214 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-config-data\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.673376 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kolla-config\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.673407 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.673583 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.674044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2hs\" (UniqueName: \"kubernetes.io/projected/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kube-api-access-9j2hs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.735436 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.775965 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kolla-config\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.776029 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.776072 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.776154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2hs\" (UniqueName: \"kubernetes.io/projected/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kube-api-access-9j2hs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.776190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-config-data\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.777406 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kolla-config\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.779403 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-config-data\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.781077 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.780771 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:00 crc kubenswrapper[4772]: I0930 17:19:00.799791 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2hs\" (UniqueName: \"kubernetes.io/projected/d0056c55-0e0c-4dc0-8739-4a6e05db35ea-kube-api-access-9j2hs\") pod \"memcached-0\" (UID: \"d0056c55-0e0c-4dc0-8739-4a6e05db35ea\") " pod="openstack/memcached-0" Sep 30 17:19:01 crc kubenswrapper[4772]: I0930 17:19:01.000143 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.430950 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.432583 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.439783 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tnj9h" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.503727 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxbw\" (UniqueName: \"kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw\") pod \"kube-state-metrics-0\" (UID: \"f7de151f-3a4a-46c0-ae33-74cb5da8b13a\") " pod="openstack/kube-state-metrics-0" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.513689 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.606178 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgxbw\" (UniqueName: \"kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw\") pod \"kube-state-metrics-0\" (UID: \"f7de151f-3a4a-46c0-ae33-74cb5da8b13a\") " pod="openstack/kube-state-metrics-0" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.630261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxbw\" (UniqueName: \"kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw\") pod \"kube-state-metrics-0\" (UID: \"f7de151f-3a4a-46c0-ae33-74cb5da8b13a\") " pod="openstack/kube-state-metrics-0" Sep 30 17:19:02 crc kubenswrapper[4772]: I0930 17:19:02.764327 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.629710 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.631802 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.644093 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-phjm5" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.644250 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.644470 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.648619 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.650621 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.657247 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.658720 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725018 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725169 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725211 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rdk\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725245 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725272 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725306 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.725321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.826911 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827208 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rdk\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827249 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827275 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827296 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827313 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827330 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.827370 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.828275 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.832898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.832898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.833984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.834922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.835390 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.835422 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a4cd7d25308c8c5d6d110405c655d59b160fe777a0b1c5faa198b785c403f1cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.835478 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.861012 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rdk\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.876978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:03 crc kubenswrapper[4772]: I0930 17:19:03.959973 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.351559 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6v6fm"] Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.353281 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.355911 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.356182 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.356181 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ck2z2" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.364373 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-t5kwk"] Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.367044 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.371433 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm"] Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.386320 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t5kwk"] Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456235 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-etc-ovs\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-scripts\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456328 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-log-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-run\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-log\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456526 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-ovn-controller-tls-certs\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456569 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98k86\" (UniqueName: \"kubernetes.io/projected/e052869f-fd26-497b-9573-0ee6221fa96c-kube-api-access-98k86\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 
17:19:05.456605 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-combined-ca-bundle\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456627 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-lib\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456667 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqxt\" (UniqueName: \"kubernetes.io/projected/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-kube-api-access-gfqxt\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.456688 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e052869f-fd26-497b-9573-0ee6221fa96c-scripts\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558466 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-scripts\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-log-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-run\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-log\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558627 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-ovn-controller-tls-certs\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98k86\" (UniqueName: \"kubernetes.io/projected/e052869f-fd26-497b-9573-0ee6221fa96c-kube-api-access-98k86\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558686 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-combined-ca-bundle\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558709 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-lib\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558725 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqxt\" (UniqueName: \"kubernetes.io/projected/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-kube-api-access-gfqxt\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e052869f-fd26-497b-9573-0ee6221fa96c-scripts\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.558776 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-etc-ovs\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.559423 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-etc-ovs\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " 
pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.559888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-log-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.560021 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.560031 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-var-run-ovn\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.560308 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-lib\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.560359 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-run\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.560705 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e052869f-fd26-497b-9573-0ee6221fa96c-var-log\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.562376 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-scripts\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.562914 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e052869f-fd26-497b-9573-0ee6221fa96c-scripts\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.564240 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-combined-ca-bundle\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.583712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqxt\" (UniqueName: \"kubernetes.io/projected/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-kube-api-access-gfqxt\") pod 
\"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.585484 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98k86\" (UniqueName: \"kubernetes.io/projected/e052869f-fd26-497b-9573-0ee6221fa96c-kube-api-access-98k86\") pod \"ovn-controller-ovs-t5kwk\" (UID: \"e052869f-fd26-497b-9573-0ee6221fa96c\") " pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.588219 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66affdf-221c-4a29-a1f7-0c3d7e4d4153-ovn-controller-tls-certs\") pod \"ovn-controller-6v6fm\" (UID: \"d66affdf-221c-4a29-a1f7-0c3d7e4d4153\") " pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.727288 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:05 crc kubenswrapper[4772]: I0930 17:19:05.727300 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.504833 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.506912 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.510372 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.510595 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.510918 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.511071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.520891 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fmz68" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.573033 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607169 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fs9c\" (UniqueName: \"kubernetes.io/projected/99ec9fea-a439-415b-ac73-3c4d0242eeb3-kube-api-access-6fs9c\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607298 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607325 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607530 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.607611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709207 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fs9c\" (UniqueName: \"kubernetes.io/projected/99ec9fea-a439-415b-ac73-3c4d0242eeb3-kube-api-access-6fs9c\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.709305 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.710691 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.711003 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.711474 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.713397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ec9fea-a439-415b-ac73-3c4d0242eeb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.716726 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0"
\"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.729248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ec9fea-a439-415b-ac73-3c4d0242eeb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.731633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fs9c\" (UniqueName: \"kubernetes.io/projected/99ec9fea-a439-415b-ac73-3c4d0242eeb3-kube-api-access-6fs9c\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.735400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"99ec9fea-a439-415b-ac73-3c4d0242eeb3\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:07 crc kubenswrapper[4772]: I0930 17:19:07.823444 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.036199 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.038531 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.040930 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jm86f" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.041571 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.044926 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.045532 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.045542 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.145996 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146087 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-config\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146131 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146161 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146200 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146235 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv6sj\" (UniqueName: \"kubernetes.io/projected/da545add-e15e-4ed4-b084-66691b57284b-kube-api-access-sv6sj\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146261 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.146307 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da545add-e15e-4ed4-b084-66691b57284b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv6sj\" (UniqueName: \"kubernetes.io/projected/da545add-e15e-4ed4-b084-66691b57284b-kube-api-access-sv6sj\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da545add-e15e-4ed4-b084-66691b57284b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247786 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247818 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-config\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.247844 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.248268 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da545add-e15e-4ed4-b084-66691b57284b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.248327 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.249134 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-config\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.249338 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da545add-e15e-4ed4-b084-66691b57284b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.249460 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.251326 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.253531 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.253684 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.263148 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da545add-e15e-4ed4-b084-66691b57284b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.264599 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv6sj\" (UniqueName: \"kubernetes.io/projected/da545add-e15e-4ed4-b084-66691b57284b-kube-api-access-sv6sj\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.283484 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da545add-e15e-4ed4-b084-66691b57284b\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:10 crc kubenswrapper[4772]: I0930 17:19:10.355584 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Sep 30 17:19:13 crc kubenswrapper[4772]: E0930 17:19:13.874995 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:13 crc kubenswrapper[4772]: E0930 17:19:13.875481 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:13 crc kubenswrapper[4772]: E0930 17:19:13.876260 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrvm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6b4d96f67c-ffhdn_openstack(f6bb9cf8-6565-4414-b8db-da6be55dda45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 17:19:13 crc kubenswrapper[4772]: E0930 17:19:13.877463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" podUID="f6bb9cf8-6565-4414-b8db-da6be55dda45"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.277829 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.278091 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.278257 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf8h67chbdh56bh684h64h69h64fh5ddh67fh5f6h56bhch5c9h595h55ch95h99hcdh584h9dhb7h656h645h649h574h569hb9h5b4h5cdh666hc5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mcqkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-9c554dd7c-kbhsn_openstack(b107f703-007f-41fa-8b85-fabdaa5da089): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.279444 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" podUID="b107f703-007f-41fa-8b85-fabdaa5da089"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.646886 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.646936 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.647037 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jx69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7978bb4dbf-4fnjk_openstack(11e271ca-2800-4cd9-872c-da0a0bab2298): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.648125 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" podUID="11e271ca-2800-4cd9-872c-da0a0bab2298"
Sep 30 17:19:14 crc kubenswrapper[4772]: E0930 17:19:14.990717 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.221:5001/podified-master-centos10/openstack-neutron-server:watcher_latest\\\"\"" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" podUID="b107f703-007f-41fa-8b85-fabdaa5da089"
Need to start a new one" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.007460 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" event={"ID":"11e271ca-2800-4cd9-872c-da0a0bab2298","Type":"ContainerDied","Data":"cf3f4e3d5358f08b820a4bb1bad689bbfc10d2774020f1d16764aff8527d6692"} Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.007568 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3f4e3d5358f08b820a4bb1bad689bbfc10d2774020f1d16764aff8527d6692" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.010429 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" event={"ID":"f6bb9cf8-6565-4414-b8db-da6be55dda45","Type":"ContainerDied","Data":"f81cfc5c0cc9bf318b4a96fdb68fc31b0341949edac1455ecb86af5229013e4b"} Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.010729 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4d96f67c-ffhdn" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.105844 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.141829 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrvm7\" (UniqueName: \"kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7\") pod \"f6bb9cf8-6565-4414-b8db-da6be55dda45\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.142269 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config\") pod \"f6bb9cf8-6565-4414-b8db-da6be55dda45\" (UID: \"f6bb9cf8-6565-4414-b8db-da6be55dda45\") " Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.143223 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config" (OuterVolumeSpecName: "config") pod "f6bb9cf8-6565-4414-b8db-da6be55dda45" (UID: "f6bb9cf8-6565-4414-b8db-da6be55dda45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.202142 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.207406 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.221259 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7" (OuterVolumeSpecName: "kube-api-access-wrvm7") pod "f6bb9cf8-6565-4414-b8db-da6be55dda45" (UID: "f6bb9cf8-6565-4414-b8db-da6be55dda45"). InnerVolumeSpecName "kube-api-access-wrvm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:16 crc kubenswrapper[4772]: W0930 17:19:16.224179 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0056c55_0e0c_4dc0_8739_4a6e05db35ea.slice/crio-d1f48b81b8077fe56588992ec3d24d878a51f3ddf5d6d02cfb759b18507c0af6 WatchSource:0}: Error finding container d1f48b81b8077fe56588992ec3d24d878a51f3ddf5d6d02cfb759b18507c0af6: Status 404 returned error can't find the container with id d1f48b81b8077fe56588992ec3d24d878a51f3ddf5d6d02cfb759b18507c0af6 Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.243529 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config\") pod \"11e271ca-2800-4cd9-872c-da0a0bab2298\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.243706 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jx69\" (UniqueName: \"kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69\") pod \"11e271ca-2800-4cd9-872c-da0a0bab2298\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.243830 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc\") pod \"11e271ca-2800-4cd9-872c-da0a0bab2298\" (UID: \"11e271ca-2800-4cd9-872c-da0a0bab2298\") " Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.244281 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bb9cf8-6565-4414-b8db-da6be55dda45-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.244304 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrvm7\" (UniqueName: \"kubernetes.io/projected/f6bb9cf8-6565-4414-b8db-da6be55dda45-kube-api-access-wrvm7\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.244627 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config" (OuterVolumeSpecName: "config") pod "11e271ca-2800-4cd9-872c-da0a0bab2298" (UID: "11e271ca-2800-4cd9-872c-da0a0bab2298"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.244649 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11e271ca-2800-4cd9-872c-da0a0bab2298" (UID: "11e271ca-2800-4cd9-872c-da0a0bab2298"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.251782 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69" (OuterVolumeSpecName: "kube-api-access-6jx69") pod "11e271ca-2800-4cd9-872c-da0a0bab2298" (UID: "11e271ca-2800-4cd9-872c-da0a0bab2298"). InnerVolumeSpecName "kube-api-access-6jx69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.347439 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jx69\" (UniqueName: \"kubernetes.io/projected/11e271ca-2800-4cd9-872c-da0a0bab2298-kube-api-access-6jx69\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.347472 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.347485 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e271ca-2800-4cd9-872c-da0a0bab2298-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.484295 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.489644 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b4d96f67c-ffhdn"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.633946 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.651820 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.659985 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.700180 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: W0930 17:19:16.729282 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d947ffa_5613_4aae_b4a9_d42094fad0ae.slice/crio-0298d8ba26cdd55502c9bd6622adb0ee0b52c7f255c4f2a6017ca15bab0ec9aa WatchSource:0}: Error finding container 0298d8ba26cdd55502c9bd6622adb0ee0b52c7f255c4f2a6017ca15bab0ec9aa: Status 404 returned error can't find the container with id 0298d8ba26cdd55502c9bd6622adb0ee0b52c7f255c4f2a6017ca15bab0ec9aa Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.796475 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:19:16 crc kubenswrapper[4772]: I0930 17:19:16.886235 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t5kwk"] Sep 30 17:19:16 crc kubenswrapper[4772]: W0930 17:19:16.893546 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode052869f_fd26_497b_9573_0ee6221fa96c.slice/crio-01b3dee902408dfca772c5c5a37294c8c312c00cf5d2485bda905f72d307008c WatchSource:0}: Error finding container 01b3dee902408dfca772c5c5a37294c8c312c00cf5d2485bda905f72d307008c: Status 404 returned error can't find the container with id 01b3dee902408dfca772c5c5a37294c8c312c00cf5d2485bda905f72d307008c Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.036809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"607217cf-8f90-4adb-bca7-0271ea8a7b9b","Type":"ContainerStarted","Data":"1ad1ee84c662a306822379ff76ab47a3d264f59ac645289fc7b612396971bbe8"} Sep 30 17:19:17 crc kubenswrapper[4772]: 
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.039965 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b4b3176-3882-486d-8217-54f429906f49","Type":"ContainerStarted","Data":"238c0d414f615afc5183a7b6d9dda05c8cddc6c43a467fab0cbff3c71b6205be"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.045046 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0056c55-0e0c-4dc0-8739-4a6e05db35ea","Type":"ContainerStarted","Data":"d1f48b81b8077fe56588992ec3d24d878a51f3ddf5d6d02cfb759b18507c0af6"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.046948 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerStarted","Data":"0298d8ba26cdd55502c9bd6622adb0ee0b52c7f255c4f2a6017ca15bab0ec9aa"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.056922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7de151f-3a4a-46c0-ae33-74cb5da8b13a","Type":"ContainerStarted","Data":"a83d8c9bcfd68fa37bb64636e67bd35da6baf110424298177271fa142918bfb5"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.073143 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5548eec2-33be-42b2-9b84-572236f095db","Type":"ContainerStarted","Data":"f52c6bb3a94562a838aff3ba341686a86bb2f88a25567c363c22c564a81fe64d"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.077343 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerStarted","Data":"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.081229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5kwk" event={"ID":"e052869f-fd26-497b-9573-0ee6221fa96c","Type":"ContainerStarted","Data":"01b3dee902408dfca772c5c5a37294c8c312c00cf5d2485bda905f72d307008c"}
Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.089511 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fkkwr"]
Need to start a new one" pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.094105 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.094271 4772 generic.go:334] "Generic (PLEG): container finished" podID="1cec6569-cbbf-433a-ac10-c314faf1f80f" containerID="36027adb2af5bd2c90cef3c1bb7d9981903960971db459ece0ab182ce221c8be" exitCode=0 Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.094309 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" event={"ID":"1cec6569-cbbf-433a-ac10-c314faf1f80f","Type":"ContainerDied","Data":"36027adb2af5bd2c90cef3c1bb7d9981903960971db459ece0ab182ce221c8be"} Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.102454 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerID="106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21" exitCode=0 Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.102766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" event={"ID":"a1ccfbe9-8a91-4864-abbd-876484b84d92","Type":"ContainerDied","Data":"106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21"} Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.121759 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm" event={"ID":"d66affdf-221c-4a29-a1f7-0c3d7e4d4153","Type":"ContainerStarted","Data":"a2b7acec5b75dfdf51c396383fdeddebb9cf09080c9f880ff216309dc6025e51"} Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.121821 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fkkwr"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.140342 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"99ec9fea-a439-415b-ac73-3c4d0242eeb3","Type":"ContainerStarted","Data":"a874ae1d6c8dc3e83d5a24a0d25d7e835c4b0b292493bc20dba6a4e6b130ea74"} Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.154544 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7978bb4dbf-4fnjk" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.155686 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerStarted","Data":"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987"} Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.263227 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovn-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281180 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-config\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281210 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovs-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2t6\" (UniqueName: \"kubernetes.io/projected/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-kube-api-access-2w2t6\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281388 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.281414 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-combined-ca-bundle\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.289676 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.296911 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.302520 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.329241 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.365995 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.372854 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7978bb4dbf-4fnjk"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.384841 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovn-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.384916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-config\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.384933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovs-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.384971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2t6\" (UniqueName: \"kubernetes.io/projected/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-kube-api-access-2w2t6\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27b2h\" (UniqueName: \"kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385091 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.385136 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-combined-ca-bundle\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.386309 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovn-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.387168 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-ovs-rundir\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.387192 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-config\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.393834 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-combined-ca-bundle\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.394562 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.420647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2t6\" (UniqueName: \"kubernetes.io/projected/43af7d7d-ee79-4c8c-b4fd-6789a382bab3-kube-api-access-2w2t6\") pod \"ovn-controller-metrics-fkkwr\" (UID: \"43af7d7d-ee79-4c8c-b4fd-6789a382bab3\") " pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.427437 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.433681 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fkkwr" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.487271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27b2h\" (UniqueName: \"kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.487323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.487352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.487374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.488632 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.489651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.491335 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.510912 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27b2h\" (UniqueName: \"kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h\") pod \"dnsmasq-dns-76b475b89f-hxnxc\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.636044 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.833424 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.859454 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.868564 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.880774 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.954394 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e271ca-2800-4cd9-872c-da0a0bab2298" path="/var/lib/kubelet/pods/11e271ca-2800-4cd9-872c-da0a0bab2298/volumes" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.954775 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bb9cf8-6565-4414-b8db-da6be55dda45" path="/var/lib/kubelet/pods/f6bb9cf8-6565-4414-b8db-da6be55dda45/volumes" Sep 30 17:19:17 crc kubenswrapper[4772]: I0930 17:19:17.955122 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.035226 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7td\" (UniqueName: \"kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.035305 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.035347 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.035378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.035447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.139789 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.139949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7td\" (UniqueName: \"kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.139997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.140067 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.140110 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.141109 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.141346 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.142001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.142094 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.172912 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7td\" (UniqueName: 
\"kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td\") pod \"dnsmasq-dns-556574fbcf-gsxp7\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:18 crc kubenswrapper[4772]: I0930 17:19:18.248747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:19 crc kubenswrapper[4772]: W0930 17:19:19.394432 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda545add_e15e_4ed4_b084_66691b57284b.slice/crio-9a987859a867217079900d356a90a6c880297b4ec7cede49ca83e14725b685a8 WatchSource:0}: Error finding container 9a987859a867217079900d356a90a6c880297b4ec7cede49ca83e14725b685a8: Status 404 returned error can't find the container with id 9a987859a867217079900d356a90a6c880297b4ec7cede49ca83e14725b685a8 Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.478862 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.483377 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.575963 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config\") pod \"b107f703-007f-41fa-8b85-fabdaa5da089\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.576044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config\") pod \"1cec6569-cbbf-433a-ac10-c314faf1f80f\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.576228 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcqkf\" (UniqueName: \"kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf\") pod \"b107f703-007f-41fa-8b85-fabdaa5da089\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.576291 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hthx\" (UniqueName: \"kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx\") pod \"1cec6569-cbbf-433a-ac10-c314faf1f80f\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.576349 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc\") pod \"b107f703-007f-41fa-8b85-fabdaa5da089\" (UID: \"b107f703-007f-41fa-8b85-fabdaa5da089\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.576391 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc\") pod \"1cec6569-cbbf-433a-ac10-c314faf1f80f\" (UID: \"1cec6569-cbbf-433a-ac10-c314faf1f80f\") " Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.577001 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b107f703-007f-41fa-8b85-fabdaa5da089" (UID: "b107f703-007f-41fa-8b85-fabdaa5da089"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.577048 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config" (OuterVolumeSpecName: "config") pod "b107f703-007f-41fa-8b85-fabdaa5da089" (UID: "b107f703-007f-41fa-8b85-fabdaa5da089"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.585557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx" (OuterVolumeSpecName: "kube-api-access-7hthx") pod "1cec6569-cbbf-433a-ac10-c314faf1f80f" (UID: "1cec6569-cbbf-433a-ac10-c314faf1f80f"). InnerVolumeSpecName "kube-api-access-7hthx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.596089 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf" (OuterVolumeSpecName: "kube-api-access-mcqkf") pod "b107f703-007f-41fa-8b85-fabdaa5da089" (UID: "b107f703-007f-41fa-8b85-fabdaa5da089"). InnerVolumeSpecName "kube-api-access-mcqkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.597037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config" (OuterVolumeSpecName: "config") pod "1cec6569-cbbf-433a-ac10-c314faf1f80f" (UID: "1cec6569-cbbf-433a-ac10-c314faf1f80f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.599920 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cec6569-cbbf-433a-ac10-c314faf1f80f" (UID: "1cec6569-cbbf-433a-ac10-c314faf1f80f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678474 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcqkf\" (UniqueName: \"kubernetes.io/projected/b107f703-007f-41fa-8b85-fabdaa5da089-kube-api-access-mcqkf\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678524 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hthx\" (UniqueName: \"kubernetes.io/projected/1cec6569-cbbf-433a-ac10-c314faf1f80f-kube-api-access-7hthx\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678535 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678547 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678556 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b107f703-007f-41fa-8b85-fabdaa5da089-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:19 crc kubenswrapper[4772]: I0930 17:19:19.678568 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cec6569-cbbf-433a-ac10-c314faf1f80f-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.279287 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da545add-e15e-4ed4-b084-66691b57284b","Type":"ContainerStarted","Data":"9a987859a867217079900d356a90a6c880297b4ec7cede49ca83e14725b685a8"} Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.280656 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.280722 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c554dd7c-kbhsn" event={"ID":"b107f703-007f-41fa-8b85-fabdaa5da089","Type":"ContainerDied","Data":"8b5cbc5686a9b686fdede061ab39dbd826f67800831a2f42dd0df6fd4f47b28b"} Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.282301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" event={"ID":"1cec6569-cbbf-433a-ac10-c314faf1f80f","Type":"ContainerDied","Data":"e505a04d65dc76982ea4c1cb2b290a296a25bcc049522025f1928b38218b074f"} Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.282335 4772 scope.go:117] "RemoveContainer" containerID="36027adb2af5bd2c90cef3c1bb7d9981903960971db459ece0ab182ce221c8be" Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.282408 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79d46b9689-d9snl" Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.348921 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.355006 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79d46b9689-d9snl"] Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.377685 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:19:20 crc kubenswrapper[4772]: I0930 17:19:20.384318 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9c554dd7c-kbhsn"] Sep 30 17:19:21 crc kubenswrapper[4772]: I0930 17:19:21.909114 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cec6569-cbbf-433a-ac10-c314faf1f80f" path="/var/lib/kubelet/pods/1cec6569-cbbf-433a-ac10-c314faf1f80f/volumes" Sep 30 17:19:21 crc kubenswrapper[4772]: I0930 17:19:21.909697 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b107f703-007f-41fa-8b85-fabdaa5da089" path="/var/lib/kubelet/pods/b107f703-007f-41fa-8b85-fabdaa5da089/volumes" Sep 30 17:19:24 crc kubenswrapper[4772]: I0930 17:19:24.745545 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:25 crc kubenswrapper[4772]: I0930 17:19:25.048411 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:19:25 crc kubenswrapper[4772]: I0930 17:19:25.120931 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fkkwr"] Sep 30 17:19:25 crc kubenswrapper[4772]: I0930 17:19:25.391155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" event={"ID":"eb1c02f4-3278-47ea-8958-945b14fe2868","Type":"ContainerStarted","Data":"e3f42dcdeb8a994de7bb4909ce1b3dac2f5369fcb37328d6fc6e20776beef443"} Sep 30 17:19:25 crc kubenswrapper[4772]: I0930 17:19:25.392344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" event={"ID":"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8","Type":"ContainerStarted","Data":"0318b1ae1c6110c68dc0f530b717548920b18e2cef75e1affe6df0700d8978e4"} Sep 30 17:19:25 crc kubenswrapper[4772]: I0930 17:19:25.393722 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fkkwr" event={"ID":"43af7d7d-ee79-4c8c-b4fd-6789a382bab3","Type":"ContainerStarted","Data":"376ba81cf3c1aab353984415bbc6518310a8518d6e8deeb8b82fe11333f40764"} Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.426854 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" event={"ID":"a1ccfbe9-8a91-4864-abbd-876484b84d92","Type":"ContainerStarted","Data":"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1"} Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.427229 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="dnsmasq-dns" containerID="cri-o://b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1" gracePeriod=10 Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.427429 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:19:26 crc kubenswrapper[4772]: 
I0930 17:19:26.432608 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0056c55-0e0c-4dc0-8739-4a6e05db35ea","Type":"ContainerStarted","Data":"f33aaaf77bb2e61d1ec3a942401e019c7334fa95d842c8acd2f3acf9429e0576"} Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.433441 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.455125 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" podStartSLOduration=12.487893639 podStartE2EDuration="30.455106052s" podCreationTimestamp="2025-09-30 17:18:56 +0000 UTC" firstStartedPulling="2025-09-30 17:18:57.728211126 +0000 UTC m=+1038.635223957" lastFinishedPulling="2025-09-30 17:19:15.695423539 +0000 UTC m=+1056.602436370" observedRunningTime="2025-09-30 17:19:26.445489286 +0000 UTC m=+1067.352502117" watchObservedRunningTime="2025-09-30 17:19:26.455106052 +0000 UTC m=+1067.362118883" Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.466618 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.378634087000002 podStartE2EDuration="26.466600786s" podCreationTimestamp="2025-09-30 17:19:00 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.227416493 +0000 UTC m=+1057.134429324" lastFinishedPulling="2025-09-30 17:19:24.315383192 +0000 UTC m=+1065.222396023" observedRunningTime="2025-09-30 17:19:26.466246747 +0000 UTC m=+1067.373259588" watchObservedRunningTime="2025-09-30 17:19:26.466600786 +0000 UTC m=+1067.373613617" Sep 30 17:19:26 crc kubenswrapper[4772]: I0930 17:19:26.984207 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.124660 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config\") pod \"a1ccfbe9-8a91-4864-abbd-876484b84d92\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.124745 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc\") pod \"a1ccfbe9-8a91-4864-abbd-876484b84d92\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.124817 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc\") pod \"a1ccfbe9-8a91-4864-abbd-876484b84d92\" (UID: \"a1ccfbe9-8a91-4864-abbd-876484b84d92\") " Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.130640 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc" (OuterVolumeSpecName: "kube-api-access-7brqc") pod "a1ccfbe9-8a91-4864-abbd-876484b84d92" (UID: "a1ccfbe9-8a91-4864-abbd-876484b84d92"). InnerVolumeSpecName "kube-api-access-7brqc". 
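[editor's note] The two "Observed pod startup duration" records above encode one piece of arithmetic: podStartE2EDuration is the watch-observed running time minus the pod creation timestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). A minimal Go sketch reproducing the numbers for dnsmasq-dns-77fb6b9747-wl7s4 (timestamps copied from the log; the program is illustrative, not kubelet code):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-09-30 17:18:56 +0000 UTC")
	firstPull := parse("2025-09-30 17:18:57.728211126 +0000 UTC")
	lastPull := parse("2025-09-30 17:19:15.695423539 +0000 UTC")
	observed := parse("2025-09-30 17:19:26.455106052 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)         // 30.455106052s = podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 12.487893639s = podStartSLOduration

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}

The same identity holds for memcached-0 above (26.466600786s - 8.087966699s of pulling = 18.378634087s), and for the later entries where both pull timestamps are the zero value "0001-01-01 00:00:00 +0000 UTC", meaning no image pull was needed, the SLO and E2E durations coincide.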
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.172045 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1ccfbe9-8a91-4864-abbd-876484b84d92" (UID: "a1ccfbe9-8a91-4864-abbd-876484b84d92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.177371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config" (OuterVolumeSpecName: "config") pod "a1ccfbe9-8a91-4864-abbd-876484b84d92" (UID: "a1ccfbe9-8a91-4864-abbd-876484b84d92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.227094 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.227129 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1ccfbe9-8a91-4864-abbd-876484b84d92-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.227140 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/a1ccfbe9-8a91-4864-abbd-876484b84d92-kube-api-access-7brqc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.443583 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"99ec9fea-a439-415b-ac73-3c4d0242eeb3","Type":"ContainerStarted","Data":"e81b406fcb410f75b859d29c29632bc6b71d7942622c98be5ddebcb121c10613"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.445957 4772 generic.go:334] "Generic (PLEG): container finished" podID="e052869f-fd26-497b-9573-0ee6221fa96c" containerID="79055771eee3bc01c35a02230065e2cdcd6cf049818ec8008eb5935919d77cbd" exitCode=0 Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.446047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5kwk" event={"ID":"e052869f-fd26-497b-9573-0ee6221fa96c","Type":"ContainerDied","Data":"79055771eee3bc01c35a02230065e2cdcd6cf049818ec8008eb5935919d77cbd"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.448270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm" event={"ID":"d66affdf-221c-4a29-a1f7-0c3d7e4d4153","Type":"ContainerStarted","Data":"360b21b4347a4d7c3e5739e1bc32351238f69d98d672563ab033ad0970fee24d"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.448592 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6v6fm" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.451004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b4b3176-3882-486d-8217-54f429906f49","Type":"ContainerStarted","Data":"9e9f1d8a318aec4caddd9896c9cba05e05c48fe0ea1e03725eba11e4ddfde818"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.452442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"5548eec2-33be-42b2-9b84-572236f095db","Type":"ContainerStarted","Data":"cfff005e3c754ef69e43a6b33dbf1cbd7d11b678b00df59f21048c6bb4dd244f"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.457919 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7de151f-3a4a-46c0-ae33-74cb5da8b13a","Type":"ContainerStarted","Data":"c3a6cb858479d78e84807d01e01b63af66640a5000e12b8f2245386e6d6b788b"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.459386 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.462152 4772 generic.go:334] "Generic (PLEG): container finished" podID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerID="ff8328a751db53ac5bb7008f5528622bb6fd64c859175bdaf3153b2197395f1a" exitCode=0 Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.462223 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" event={"ID":"eb1c02f4-3278-47ea-8958-945b14fe2868","Type":"ContainerDied","Data":"ff8328a751db53ac5bb7008f5528622bb6fd64c859175bdaf3153b2197395f1a"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.468871 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerID="b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1" exitCode=0 Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.469031 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" event={"ID":"a1ccfbe9-8a91-4864-abbd-876484b84d92","Type":"ContainerDied","Data":"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.469090 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" event={"ID":"a1ccfbe9-8a91-4864-abbd-876484b84d92","Type":"ContainerDied","Data":"9be6eff1e1cb7c6bbbae679ad683a4fa35b7570ddafdbbfbd7e94e6fee8bafdf"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.469114 4772 scope.go:117] "RemoveContainer" containerID="b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.469272 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fb6b9747-wl7s4" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.482091 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da545add-e15e-4ed4-b084-66691b57284b","Type":"ContainerStarted","Data":"b6165dbd88023fde4c43da1ded7202e2b85cd570b4a93168c556c028eda8c7d9"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.486945 4772 generic.go:334] "Generic (PLEG): container finished" podID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerID="cc709c2547b5c4831e309f72842cc5b1337ec10c73814e9e06a9fb42db48b63f" exitCode=0 Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.487118 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" event={"ID":"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8","Type":"ContainerDied","Data":"cc709c2547b5c4831e309f72842cc5b1337ec10c73814e9e06a9fb42db48b63f"} Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.524234 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6v6fm" podStartSLOduration=14.578686464 podStartE2EDuration="22.524010177s" podCreationTimestamp="2025-09-30 17:19:05 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.693623477 +0000 UTC m=+1057.600636308" lastFinishedPulling="2025-09-30 17:19:24.63894718 +0000 UTC m=+1065.545960021" observedRunningTime="2025-09-30 17:19:27.485793271 +0000 UTC m=+1068.392806102" watchObservedRunningTime="2025-09-30 17:19:27.524010177 +0000 UTC m=+1068.431023008" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.529421 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.939543665 podStartE2EDuration="25.529407225s" podCreationTimestamp="2025-09-30 17:19:02 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.68200094 +0000 UTC m=+1057.589013781" lastFinishedPulling="2025-09-30 17:19:26.27186451 +0000 UTC m=+1067.178877341" observedRunningTime="2025-09-30 17:19:27.499036699 +0000 UTC m=+1068.406049530" watchObservedRunningTime="2025-09-30 17:19:27.529407225 +0000 UTC m=+1068.436420056" Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.622996 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.633638 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77fb6b9747-wl7s4"] Sep 30 17:19:27 crc kubenswrapper[4772]: I0930 17:19:27.912254 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" path="/var/lib/kubelet/pods/a1ccfbe9-8a91-4864-abbd-876484b84d92/volumes" Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.471457 4772 scope.go:117] "RemoveContainer" containerID="106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21" Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.512378 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerStarted","Data":"880847e146733e46ee15523008c4cce1978586e9b6c0a83798cbe8f338b9a176"} Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.599669 4772 scope.go:117] "RemoveContainer" containerID="b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1" Sep 30 17:19:29 crc kubenswrapper[4772]: E0930 17:19:29.601805 4772 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1\": container with ID starting with b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1 not found: ID does not exist" containerID="b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1" Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.601898 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1"} err="failed to get container status \"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1\": rpc error: code = NotFound desc = could not find container \"b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1\": container with ID starting with b51fb2a78a6f8559c8b3df843cc724e9cf3f8a87da50c1dff5c4189dc7d719b1 not found: ID does not exist" Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.601932 4772 scope.go:117] "RemoveContainer" containerID="106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21" Sep 30 17:19:29 crc kubenswrapper[4772]: E0930 17:19:29.603079 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21\": container with ID starting with 106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21 not found: ID does not exist" containerID="106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21" Sep 30 17:19:29 crc kubenswrapper[4772]: I0930 17:19:29.603145 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21"} err="failed to get container status \"106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21\": rpc error: code = NotFound desc = could not find container \"106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21\": container with ID starting with 106f6c22b7513dbc835f428c8ec79dd475b2e10e0cefd227b759587d26ffcf21 not found: ID does not exist" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.528325 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"99ec9fea-a439-415b-ac73-3c4d0242eeb3","Type":"ContainerStarted","Data":"ccb61483e0e2488f4df4c0c55edd3c17abf9b8650dac6fee1d4269d1534dd042"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.531399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" event={"ID":"eb1c02f4-3278-47ea-8958-945b14fe2868","Type":"ContainerStarted","Data":"07645a3578e6eb48a011080c809807704a17c3736accb5af852dc82610a513aa"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.531661 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.534311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5kwk" event={"ID":"e052869f-fd26-497b-9573-0ee6221fa96c","Type":"ContainerStarted","Data":"a0620daa81de9ab5ed7c250a047e795c914b323f1f7279c845812490c2dd97d7"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.534361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5kwk" 
event={"ID":"e052869f-fd26-497b-9573-0ee6221fa96c","Type":"ContainerStarted","Data":"3ad2ad6114c807af4de6a4d886588a58da9357661aa5a8caf15bb80fca801d34"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.534395 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.534491 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.538542 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da545add-e15e-4ed4-b084-66691b57284b","Type":"ContainerStarted","Data":"314ecaf5bff56a1bb04d7260bf3cff6d6d6ba54e3e186452b7908ed4966916e5"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.540870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" event={"ID":"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8","Type":"ContainerStarted","Data":"4308fcddf8779a541588e523658ba5e72b9933a764085f782798615764c3d532"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.540948 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.542847 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fkkwr" event={"ID":"43af7d7d-ee79-4c8c-b4fd-6789a382bab3","Type":"ContainerStarted","Data":"8dd8538e8b55c8f42b94e680ce9a7d72031840aef10646f83fecd0e8d37f2144"} Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.556370 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.721405071 podStartE2EDuration="24.556342516s" podCreationTimestamp="2025-09-30 17:19:06 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.808720748 +0000 UTC m=+1057.715733579" lastFinishedPulling="2025-09-30 17:19:29.643658193 +0000 UTC m=+1070.550671024" observedRunningTime="2025-09-30 17:19:30.544928014 +0000 UTC m=+1071.451940845" watchObservedRunningTime="2025-09-30 17:19:30.556342516 +0000 UTC m=+1071.463355347" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.563741 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fkkwr" podStartSLOduration=9.154033137 podStartE2EDuration="13.563722444s" podCreationTimestamp="2025-09-30 17:19:17 +0000 UTC" firstStartedPulling="2025-09-30 17:19:25.190024323 +0000 UTC m=+1066.097037144" lastFinishedPulling="2025-09-30 17:19:29.59971362 +0000 UTC m=+1070.506726451" observedRunningTime="2025-09-30 17:19:30.560782469 +0000 UTC m=+1071.467795310" watchObservedRunningTime="2025-09-30 17:19:30.563722444 +0000 UTC m=+1071.470735285" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.602046 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.3694696 podStartE2EDuration="21.602025283s" podCreationTimestamp="2025-09-30 17:19:09 +0000 UTC" firstStartedPulling="2025-09-30 17:19:19.396693251 +0000 UTC m=+1060.303706082" lastFinishedPulling="2025-09-30 17:19:29.629248934 +0000 UTC m=+1070.536261765" observedRunningTime="2025-09-30 17:19:30.600023042 +0000 UTC m=+1071.507035883" watchObservedRunningTime="2025-09-30 17:19:30.602025283 +0000 UTC m=+1071.509038114" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.630447 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" podStartSLOduration=13.630426589 podStartE2EDuration="13.630426589s" podCreationTimestamp="2025-09-30 17:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.619807647 +0000 UTC m=+1071.526820468" watchObservedRunningTime="2025-09-30 17:19:30.630426589 +0000 UTC m=+1071.537439420" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.644224 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-t5kwk" podStartSLOduration=17.920614774 podStartE2EDuration="25.644205501s" podCreationTimestamp="2025-09-30 17:19:05 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.895970918 +0000 UTC m=+1057.802983749" lastFinishedPulling="2025-09-30 17:19:24.619561635 +0000 UTC m=+1065.526574476" observedRunningTime="2025-09-30 17:19:30.641627725 +0000 UTC m=+1071.548640576" watchObservedRunningTime="2025-09-30 17:19:30.644205501 +0000 UTC m=+1071.551218332" Sep 30 17:19:30 crc kubenswrapper[4772]: I0930 17:19:30.672752 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" podStartSLOduration=13.67273372 podStartE2EDuration="13.67273372s" podCreationTimestamp="2025-09-30 17:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.666623054 +0000 UTC m=+1071.573635895" watchObservedRunningTime="2025-09-30 17:19:30.67273372 +0000 UTC m=+1071.579746551" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.002415 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.356439 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.412118 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.551860 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.596414 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.824232 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:31 crc kubenswrapper[4772]: I0930 17:19:31.886584 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.558188 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.600327 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.788910 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.805276 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 17:19:32 crc 
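[editor's note] The ovsdbserver probe sequences above follow a fixed order: the startup probe reports "unhealthy", then "started", and only after that do readiness results appear ("" while unsettled, then "ready"), because readiness probing is held back until the startup probe succeeds. A toy Go sketch of that gating, under that assumption (this mirrors the log's status strings, not the kubelet's prober implementation):

package main

import "fmt"

// probeResult mirrors the status strings in the log: "unhealthy",
// "started"/"ready", or "" while no result is established yet.
type probeResult string

// gate replays startup-probe results and starts readiness probing only once
// the startup probe has reported "started".
func gate(startup, readiness []probeResult) {
	ok := false
	for _, r := range startup {
		fmt.Printf("probe=%q status=%q\n", "startup", r)
		if r == "started" {
			ok = true
			break
		}
	}
	if !ok {
		return // readiness never runs until startup succeeds
	}
	for _, r := range readiness {
		fmt.Printf("probe=%q status=%q\n", "readiness", r)
	}
}

func main() {
	// The ovsdbserver-sb-0 sequence from the log: one failed startup probe,
	// then success, then readiness settling to "ready".
	gate(
		[]probeResult{"unhealthy", "started"},
		[]probeResult{"", "ready"},
	)
}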
Sep 30 17:19:32 crc kubenswrapper[4772]: E0930 17:19:32.807854 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="init"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.807880 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="init"
Sep 30 17:19:32 crc kubenswrapper[4772]: E0930 17:19:32.807949 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="dnsmasq-dns"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.807958 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="dnsmasq-dns"
Sep 30 17:19:32 crc kubenswrapper[4772]: E0930 17:19:32.807996 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cec6569-cbbf-433a-ac10-c314faf1f80f" containerName="init"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.808004 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cec6569-cbbf-433a-ac10-c314faf1f80f" containerName="init"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.808577 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ccfbe9-8a91-4864-abbd-876484b84d92" containerName="dnsmasq-dns"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.808636 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cec6569-cbbf-433a-ac10-c314faf1f80f" containerName="init"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.827527 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.832894 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.834370 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.834647 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-sb26v"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.835631 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.837283 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.958697 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959299 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-config\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959363 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2ld\" (UniqueName: \"kubernetes.io/projected/10ca909a-0a73-4f62-89a4-ed8ffac99539-kube-api-access-8h2ld\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-scripts\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959443 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959474 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:32 crc kubenswrapper[4772]: I0930 17:19:32.959514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.060955 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061348 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-config\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2ld\" (UniqueName: \"kubernetes.io/projected/10ca909a-0a73-4f62-89a4-ed8ffac99539-kube-api-access-8h2ld\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-scripts\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061683 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.061790 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.062041 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.062442 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-config\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.062964 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10ca909a-0a73-4f62-89a4-ed8ffac99539-scripts\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.069413 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.070392 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.071024 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ca909a-0a73-4f62-89a4-ed8ffac99539-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.090020 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2ld\" (UniqueName: \"kubernetes.io/projected/10ca909a-0a73-4f62-89a4-ed8ffac99539-kube-api-access-8h2ld\") pod \"ovn-northd-0\" (UID: \"10ca909a-0a73-4f62-89a4-ed8ffac99539\") " pod="openstack/ovn-northd-0"
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.573240 4772 generic.go:334] "Generic (PLEG): container finished" podID="4b4b3176-3882-486d-8217-54f429906f49" containerID="9e9f1d8a318aec4caddd9896c9cba05e05c48fe0ea1e03725eba11e4ddfde818" exitCode=0 Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.573361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b4b3176-3882-486d-8217-54f429906f49","Type":"ContainerDied","Data":"9e9f1d8a318aec4caddd9896c9cba05e05c48fe0ea1e03725eba11e4ddfde818"} Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.579827 4772 generic.go:334] "Generic (PLEG): container finished" podID="5548eec2-33be-42b2-9b84-572236f095db" containerID="cfff005e3c754ef69e43a6b33dbf1cbd7d11b678b00df59f21048c6bb4dd244f" exitCode=0 Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.579914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5548eec2-33be-42b2-9b84-572236f095db","Type":"ContainerDied","Data":"cfff005e3c754ef69e43a6b33dbf1cbd7d11b678b00df59f21048c6bb4dd244f"} Sep 30 17:19:33 crc kubenswrapper[4772]: I0930 17:19:33.697915 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 17:19:33 crc kubenswrapper[4772]: W0930 17:19:33.706362 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10ca909a_0a73_4f62_89a4_ed8ffac99539.slice/crio-ad24bd3c348fce5dccb2ed8c612888439a704099ad7393a3999d6882094e5ecb WatchSource:0}: Error finding container ad24bd3c348fce5dccb2ed8c612888439a704099ad7393a3999d6882094e5ecb: Status 404 returned error can't find the container with id ad24bd3c348fce5dccb2ed8c612888439a704099ad7393a3999d6882094e5ecb Sep 30 17:19:34 crc kubenswrapper[4772]: I0930 17:19:34.595050 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b4b3176-3882-486d-8217-54f429906f49","Type":"ContainerStarted","Data":"414798c23c5cdad58e5a81dd924e55f55ddcbf07fce2a5cfa90449fcb49215f8"} Sep 30 17:19:34 crc kubenswrapper[4772]: I0930 17:19:34.596758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10ca909a-0a73-4f62-89a4-ed8ffac99539","Type":"ContainerStarted","Data":"ad24bd3c348fce5dccb2ed8c612888439a704099ad7393a3999d6882094e5ecb"} Sep 30 17:19:34 crc kubenswrapper[4772]: I0930 17:19:34.599624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5548eec2-33be-42b2-9b84-572236f095db","Type":"ContainerStarted","Data":"ead7125dcfbd74a5c95b306291b604a41d9ddc0de6f80a14cf86dcfed3b180a2"} Sep 30 17:19:34 crc kubenswrapper[4772]: I0930 17:19:34.620891 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.116335807 podStartE2EDuration="35.620875041s" podCreationTimestamp="2025-09-30 17:18:59 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.245198908 +0000 UTC m=+1057.152211739" lastFinishedPulling="2025-09-30 17:19:24.749738142 +0000 UTC m=+1065.656750973" observedRunningTime="2025-09-30 17:19:34.61615353 +0000 UTC m=+1075.523166361" watchObservedRunningTime="2025-09-30 17:19:34.620875041 +0000 UTC m=+1075.527887862" Sep 30 17:19:34 crc kubenswrapper[4772]: I0930 17:19:34.650147 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-galera-0" podStartSLOduration=27.148490538 podStartE2EDuration="35.650129518s" podCreationTimestamp="2025-09-30 17:18:59 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.688540437 +0000 UTC m=+1057.595553268" lastFinishedPulling="2025-09-30 17:19:25.190179417 +0000 UTC m=+1066.097192248" observedRunningTime="2025-09-30 17:19:34.6411783 +0000 UTC m=+1075.548191131" watchObservedRunningTime="2025-09-30 17:19:34.650129518 +0000 UTC m=+1075.557142349" Sep 30 17:19:36 crc kubenswrapper[4772]: I0930 17:19:36.623840 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10ca909a-0a73-4f62-89a4-ed8ffac99539","Type":"ContainerStarted","Data":"2279008a49abb0788c14c61e021d2942f8b3091a033c8702d384d8c25d40b1db"} Sep 30 17:19:37 crc kubenswrapper[4772]: I0930 17:19:37.637356 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10ca909a-0a73-4f62-89a4-ed8ffac99539","Type":"ContainerStarted","Data":"010ff045472f87b4295520beab0710df601a22bdad5043391f1c74798ea33e7c"} Sep 30 17:19:37 crc kubenswrapper[4772]: I0930 17:19:37.637819 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 17:19:37 crc kubenswrapper[4772]: I0930 17:19:37.639751 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:37 crc kubenswrapper[4772]: I0930 17:19:37.662607 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.257672144 podStartE2EDuration="5.662579458s" podCreationTimestamp="2025-09-30 17:19:32 +0000 UTC" firstStartedPulling="2025-09-30 17:19:33.710176469 +0000 UTC m=+1074.617189300" lastFinishedPulling="2025-09-30 17:19:36.115083783 +0000 UTC m=+1077.022096614" observedRunningTime="2025-09-30 17:19:37.654592084 +0000 UTC m=+1078.561604925" watchObservedRunningTime="2025-09-30 17:19:37.662579458 +0000 UTC m=+1078.569592329" Sep 30 17:19:38 crc kubenswrapper[4772]: I0930 17:19:38.251536 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:19:38 crc kubenswrapper[4772]: I0930 17:19:38.314374 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:38 crc kubenswrapper[4772]: I0930 17:19:38.648115 4772 generic.go:334] "Generic (PLEG): container finished" podID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerID="880847e146733e46ee15523008c4cce1978586e9b6c0a83798cbe8f338b9a176" exitCode=0 Sep 30 17:19:38 crc kubenswrapper[4772]: I0930 17:19:38.648646 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="dnsmasq-dns" containerID="cri-o://4308fcddf8779a541588e523658ba5e72b9933a764085f782798615764c3d532" gracePeriod=10 Sep 30 17:19:38 crc kubenswrapper[4772]: I0930 17:19:38.648280 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerDied","Data":"880847e146733e46ee15523008c4cce1978586e9b6c0a83798cbe8f338b9a176"} Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.608038 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.608321 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.667253 4772 generic.go:334] "Generic (PLEG): container finished" podID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerID="4308fcddf8779a541588e523658ba5e72b9933a764085f782798615764c3d532" exitCode=0 Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.667303 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" event={"ID":"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8","Type":"ContainerDied","Data":"4308fcddf8779a541588e523658ba5e72b9933a764085f782798615764c3d532"} Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.736194 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:40 crc kubenswrapper[4772]: I0930 17:19:40.736443 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:42 crc kubenswrapper[4772]: I0930 17:19:42.637285 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.793362 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.966525 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb\") pod \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.966688 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27b2h\" (UniqueName: \"kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h\") pod \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.966735 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config\") pod \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.966782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc\") pod \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\" (UID: \"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8\") " Sep 30 17:19:43 crc kubenswrapper[4772]: I0930 17:19:43.978237 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h" (OuterVolumeSpecName: "kube-api-access-27b2h") pod "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" (UID: "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8"). InnerVolumeSpecName "kube-api-access-27b2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.010313 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" (UID: "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.010977 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" (UID: "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.021508 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config" (OuterVolumeSpecName: "config") pod "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" (UID: "aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.069272 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.069312 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.069326 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27b2h\" (UniqueName: \"kubernetes.io/projected/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-kube-api-access-27b2h\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.069337 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.708837 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.709498 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b475b89f-hxnxc" event={"ID":"aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8","Type":"ContainerDied","Data":"0318b1ae1c6110c68dc0f530b717548920b18e2cef75e1affe6df0700d8978e4"} Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.709577 4772 scope.go:117] "RemoveContainer" containerID="4308fcddf8779a541588e523658ba5e72b9933a764085f782798615764c3d532" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.740527 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.743210 4772 scope.go:117] "RemoveContainer" containerID="cc709c2547b5c4831e309f72842cc5b1337ec10c73814e9e06a9fb42db48b63f" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.747424 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76b475b89f-hxnxc"] Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.830415 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:44 crc kubenswrapper[4772]: I0930 17:19:44.902001 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 17:19:45 crc kubenswrapper[4772]: I0930 17:19:45.911657 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" path="/var/lib/kubelet/pods/aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8/volumes" Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.691160 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.730241 4772 generic.go:334] "Generic (PLEG): container finished" podID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerID="86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987" exitCode=0 Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.730314 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerDied","Data":"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987"} Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.732317 4772 generic.go:334] "Generic (PLEG): container finished" podID="607217cf-8f90-4adb-bca7-0271ea8a7b9b" containerID="1ad1ee84c662a306822379ff76ab47a3d264f59ac645289fc7b612396971bbe8" exitCode=0 Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.732353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"607217cf-8f90-4adb-bca7-0271ea8a7b9b","Type":"ContainerDied","Data":"1ad1ee84c662a306822379ff76ab47a3d264f59ac645289fc7b612396971bbe8"} Sep 30 17:19:46 crc kubenswrapper[4772]: I0930 17:19:46.776345 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.741112 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerStarted","Data":"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f"} Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.741576 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.744296 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerStarted","Data":"cbc3dae9d8b7c7fd7423aa2326587afe27a88c29c1d6e316093f3e5aba0c5c5a"} Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.745929 4772 generic.go:334] "Generic (PLEG): container finished" podID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerID="901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4" exitCode=0 Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.745989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerDied","Data":"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4"} Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.749601 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"607217cf-8f90-4adb-bca7-0271ea8a7b9b","Type":"ContainerStarted","Data":"b47d3ad20b134fa64d29ee73e6571fb13b3f84d222e4d00f412b6b01c8b91790"} Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.749766 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.764767 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.456120385 podStartE2EDuration="51.764750118s" podCreationTimestamp="2025-09-30 17:18:56 +0000 UTC" firstStartedPulling="2025-09-30 17:18:58.360971636 +0000 UTC m=+1039.267984467" lastFinishedPulling="2025-09-30 17:19:15.669601369 +0000 UTC m=+1056.576614200" observedRunningTime="2025-09-30 17:19:47.761445934 +0000 UTC m=+1088.668458765" watchObservedRunningTime="2025-09-30 17:19:47.764750118 +0000 UTC m=+1088.671762949" Sep 30 17:19:47 crc kubenswrapper[4772]: I0930 17:19:47.790313 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=35.495786068 podStartE2EDuration="52.790295231s" podCreationTimestamp="2025-09-30 17:18:55 +0000 UTC" firstStartedPulling="2025-09-30 17:18:58.43472158 +0000 UTC m=+1039.341734411" lastFinishedPulling="2025-09-30 17:19:15.729230743 +0000 UTC m=+1056.636243574" observedRunningTime="2025-09-30 17:19:47.783135458 +0000 UTC m=+1088.690148279" watchObservedRunningTime="2025-09-30 17:19:47.790295231 +0000 UTC m=+1088.697308062" Sep 30 17:19:48 crc kubenswrapper[4772]: I0930 17:19:48.226525 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 17:19:48 crc kubenswrapper[4772]: I0930 17:19:48.757791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerStarted","Data":"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543"} Sep 30 17:19:48 crc kubenswrapper[4772]: I0930 17:19:48.783113 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.186843708 podStartE2EDuration="52.783095651s" podCreationTimestamp="2025-09-30 17:18:56 +0000 UTC" firstStartedPulling="2025-09-30 17:18:58.176937913 +0000 UTC m=+1039.083950744" 
lastFinishedPulling="2025-09-30 17:19:15.773189846 +0000 UTC m=+1056.680202687" observedRunningTime="2025-09-30 17:19:48.779006907 +0000 UTC m=+1089.686019738" watchObservedRunningTime="2025-09-30 17:19:48.783095651 +0000 UTC m=+1089.690108482" Sep 30 17:19:49 crc kubenswrapper[4772]: I0930 17:19:49.769734 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerStarted","Data":"3a8bc55220d97adb7aaaa3f003c5f559b4015838c40cd82b1631dcd48c2c280e"} Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.610100 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jw52p"] Sep 30 17:19:50 crc kubenswrapper[4772]: E0930 17:19:50.610408 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="init" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.610423 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="init" Sep 30 17:19:50 crc kubenswrapper[4772]: E0930 17:19:50.610456 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="dnsmasq-dns" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.610462 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="dnsmasq-dns" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.610640 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa40346a-b8ee-4fb2-aa9b-d5ded01a3ba8" containerName="dnsmasq-dns" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.611258 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.623939 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jw52p"] Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.684744 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4r4\" (UniqueName: \"kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4\") pod \"keystone-db-create-jw52p\" (UID: \"e2815398-98f4-48d9-9e2a-54f25ac3fd0c\") " pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.786745 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4r4\" (UniqueName: \"kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4\") pod \"keystone-db-create-jw52p\" (UID: \"e2815398-98f4-48d9-9e2a-54f25ac3fd0c\") " pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.810079 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-srdzt"] Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.811517 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-srdzt" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.825496 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-srdzt"] Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.828256 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4r4\" (UniqueName: \"kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4\") pod \"keystone-db-create-jw52p\" (UID: \"e2815398-98f4-48d9-9e2a-54f25ac3fd0c\") " pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.888340 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv68l\" (UniqueName: \"kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l\") pod \"placement-db-create-srdzt\" (UID: \"950feb31-d399-4491-a4e4-365371d0d2b6\") " pod="openstack/placement-db-create-srdzt" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.934107 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:50 crc kubenswrapper[4772]: I0930 17:19:50.990246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv68l\" (UniqueName: \"kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l\") pod \"placement-db-create-srdzt\" (UID: \"950feb31-d399-4491-a4e4-365371d0d2b6\") " pod="openstack/placement-db-create-srdzt" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.010106 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv68l\" (UniqueName: \"kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l\") pod \"placement-db-create-srdzt\" (UID: \"950feb31-d399-4491-a4e4-365371d0d2b6\") " pod="openstack/placement-db-create-srdzt" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.162125 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-srdzt" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.163877 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dqw5r"] Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.164860 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.171405 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dqw5r"] Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.192950 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9hs\" (UniqueName: \"kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs\") pod \"glance-db-create-dqw5r\" (UID: \"9ca281cc-c87b-4f62-8d9c-12373a1dc085\") " pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.296458 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp9hs\" (UniqueName: \"kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs\") pod \"glance-db-create-dqw5r\" (UID: \"9ca281cc-c87b-4f62-8d9c-12373a1dc085\") " pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.319131 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9hs\" (UniqueName: \"kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs\") pod \"glance-db-create-dqw5r\" (UID: \"9ca281cc-c87b-4f62-8d9c-12373a1dc085\") " pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.414219 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jw52p"] Sep 30 17:19:51 crc kubenswrapper[4772]: W0930 17:19:51.426029 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2815398_98f4_48d9_9e2a_54f25ac3fd0c.slice/crio-1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf WatchSource:0}: Error finding container 1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf: Status 404 returned error can't find the container with id 1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.486178 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.668049 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-srdzt"] Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.788962 4772 generic.go:334] "Generic (PLEG): container finished" podID="e2815398-98f4-48d9-9e2a-54f25ac3fd0c" containerID="4b05e0ddeedcc0fa91d63b33ca724dccd8da801a73e685d0e56791938aecbd32" exitCode=0 Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.789018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jw52p" event={"ID":"e2815398-98f4-48d9-9e2a-54f25ac3fd0c","Type":"ContainerDied","Data":"4b05e0ddeedcc0fa91d63b33ca724dccd8da801a73e685d0e56791938aecbd32"} Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.789094 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jw52p" event={"ID":"e2815398-98f4-48d9-9e2a-54f25ac3fd0c","Type":"ContainerStarted","Data":"1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf"} Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.791536 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-srdzt" event={"ID":"950feb31-d399-4491-a4e4-365371d0d2b6","Type":"ContainerStarted","Data":"70d58674a874f929c0e99be3bf0466f3ca9ac6e71c34855ccbba1561f9be4b9c"} Sep 30 17:19:51 crc kubenswrapper[4772]: I0930 17:19:51.972890 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dqw5r"] Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.646557 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-lzvcs"] Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.648672 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.668691 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-lzvcs"] Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.718945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfq9q\" (UniqueName: \"kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q\") pod \"watcher-db-create-lzvcs\" (UID: \"09b10070-728b-4128-8faa-10b567e20342\") " pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.803240 4772 generic.go:334] "Generic (PLEG): container finished" podID="950feb31-d399-4491-a4e4-365371d0d2b6" containerID="b4ce15c683d6c40ea99bc219a4e2bf12056e181a564d45c78a204e89a72d873c" exitCode=0 Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.803315 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-srdzt" event={"ID":"950feb31-d399-4491-a4e4-365371d0d2b6","Type":"ContainerDied","Data":"b4ce15c683d6c40ea99bc219a4e2bf12056e181a564d45c78a204e89a72d873c"} Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.805200 4772 generic.go:334] "Generic (PLEG): container finished" podID="9ca281cc-c87b-4f62-8d9c-12373a1dc085" containerID="78db73365fb6aaf5ca3d228cd401bb07e9acabc5e908def1b1d31f89ab2dec5c" exitCode=0 Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.805423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqw5r" event={"ID":"9ca281cc-c87b-4f62-8d9c-12373a1dc085","Type":"ContainerDied","Data":"78db73365fb6aaf5ca3d228cd401bb07e9acabc5e908def1b1d31f89ab2dec5c"} Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.805450 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqw5r" event={"ID":"9ca281cc-c87b-4f62-8d9c-12373a1dc085","Type":"ContainerStarted","Data":"9fe13bc983e04504dfa87ded710f80521187afcc2a846524caccea5a845eff48"} Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.821008 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfq9q\" (UniqueName: \"kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q\") pod \"watcher-db-create-lzvcs\" (UID: \"09b10070-728b-4128-8faa-10b567e20342\") " pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.862506 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfq9q\" (UniqueName: \"kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q\") pod \"watcher-db-create-lzvcs\" (UID: \"09b10070-728b-4128-8faa-10b567e20342\") " pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:52 crc kubenswrapper[4772]: I0930 17:19:52.975847 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.077403 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.085028 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.093907 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4r4\" (UniqueName: \"kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4\") pod \"e2815398-98f4-48d9-9e2a-54f25ac3fd0c\" (UID: \"e2815398-98f4-48d9-9e2a-54f25ac3fd0c\") " Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.096377 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-srdzt" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.100779 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4" (OuterVolumeSpecName: "kube-api-access-7x4r4") pod "e2815398-98f4-48d9-9e2a-54f25ac3fd0c" (UID: "e2815398-98f4-48d9-9e2a-54f25ac3fd0c"). InnerVolumeSpecName "kube-api-access-7x4r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.195190 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp9hs\" (UniqueName: \"kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs\") pod \"9ca281cc-c87b-4f62-8d9c-12373a1dc085\" (UID: \"9ca281cc-c87b-4f62-8d9c-12373a1dc085\") " Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.195765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv68l\" (UniqueName: \"kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l\") pod \"950feb31-d399-4491-a4e4-365371d0d2b6\" (UID: \"950feb31-d399-4491-a4e4-365371d0d2b6\") " Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.196850 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x4r4\" (UniqueName: \"kubernetes.io/projected/e2815398-98f4-48d9-9e2a-54f25ac3fd0c-kube-api-access-7x4r4\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.198576 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs" (OuterVolumeSpecName: "kube-api-access-cp9hs") pod "9ca281cc-c87b-4f62-8d9c-12373a1dc085" (UID: "9ca281cc-c87b-4f62-8d9c-12373a1dc085"). InnerVolumeSpecName "kube-api-access-cp9hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.206570 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l" (OuterVolumeSpecName: "kube-api-access-wv68l") pod "950feb31-d399-4491-a4e4-365371d0d2b6" (UID: "950feb31-d399-4491-a4e4-365371d0d2b6"). InnerVolumeSpecName "kube-api-access-wv68l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.215743 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="607217cf-8f90-4adb-bca7-0271ea8a7b9b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.298833 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp9hs\" (UniqueName: \"kubernetes.io/projected/9ca281cc-c87b-4f62-8d9c-12373a1dc085-kube-api-access-cp9hs\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.298878 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv68l\" (UniqueName: \"kubernetes.io/projected/950feb31-d399-4491-a4e4-365371d0d2b6-kube-api-access-wv68l\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.403498 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-lzvcs"] Sep 30 17:19:57 crc kubenswrapper[4772]: W0930 17:19:57.413130 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b10070_728b_4128_8faa_10b567e20342.slice/crio-170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81 WatchSource:0}: Error finding container 170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81: Status 404 returned error can't find the container with id 170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81 Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.548669 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.550640 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.551080 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.845001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqw5r" event={"ID":"9ca281cc-c87b-4f62-8d9c-12373a1dc085","Type":"ContainerDied","Data":"9fe13bc983e04504dfa87ded710f80521187afcc2a846524caccea5a845eff48"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.845045 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dqw5r" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.845082 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fe13bc983e04504dfa87ded710f80521187afcc2a846524caccea5a845eff48" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.848250 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerStarted","Data":"b86b85d7a96ffbf9e67a24c72cad1af56de311652111c2311216f7bf53f76f91"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.850532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jw52p" event={"ID":"e2815398-98f4-48d9-9e2a-54f25ac3fd0c","Type":"ContainerDied","Data":"1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.850584 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a1338cbe77ec389dee1f73227a6e8e908c6d6bdcf430f0dd2b6a93d15f26cdf" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.850693 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jw52p" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.854247 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-srdzt" event={"ID":"950feb31-d399-4491-a4e4-365371d0d2b6","Type":"ContainerDied","Data":"70d58674a874f929c0e99be3bf0466f3ca9ac6e71c34855ccbba1561f9be4b9c"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.854258 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-srdzt" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.854267 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70d58674a874f929c0e99be3bf0466f3ca9ac6e71c34855ccbba1561f9be4b9c" Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.855633 4772 generic.go:334] "Generic (PLEG): container finished" podID="09b10070-728b-4128-8faa-10b567e20342" containerID="997add3da0e114db31ad8dcdeb005e9fd720aeaa4e6e6d7381e2647af2972623" exitCode=0 Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.855663 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-lzvcs" event={"ID":"09b10070-728b-4128-8faa-10b567e20342","Type":"ContainerDied","Data":"997add3da0e114db31ad8dcdeb005e9fd720aeaa4e6e6d7381e2647af2972623"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.855677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-lzvcs" event={"ID":"09b10070-728b-4128-8faa-10b567e20342","Type":"ContainerStarted","Data":"170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81"} Sep 30 17:19:57 crc kubenswrapper[4772]: I0930 17:19:57.888966 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.358561049 podStartE2EDuration="55.88893132s" podCreationTimestamp="2025-09-30 17:19:02 +0000 UTC" firstStartedPulling="2025-09-30 17:19:16.736586155 +0000 UTC m=+1057.643598986" lastFinishedPulling="2025-09-30 17:19:57.266956426 +0000 UTC m=+1098.173969257" observedRunningTime="2025-09-30 17:19:57.882558248 +0000 UTC m=+1098.789571079" watchObservedRunningTime="2025-09-30 17:19:57.88893132 +0000 UTC m=+1098.795944191" Sep 30 17:19:57 crc 
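[Editorial sketch] The "Observed pod startup duration" entries in this log are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that end-to-end figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal Python check against the openstack-galera-0 entry above, with timestamps copied from the log and truncated to microseconds; the decomposition is inferred from these entries, not from kubelet source:

    from datetime import datetime, timezone

    # Field values quoted from the openstack-galera-0
    # "Observed pod startup duration" entry above.
    created    = datetime(2025, 9, 30, 17, 18, 59, 0,      tzinfo=timezone.utc)
    first_pull = datetime(2025, 9, 30, 17, 19, 16, 688540, tzinfo=timezone.utc)
    last_pull  = datetime(2025, 9, 30, 17, 19, 25, 190179, tzinfo=timezone.utc)
    watched    = datetime(2025, 9, 30, 17, 19, 34, 650129, tzinfo=timezone.utc)

    e2e  = (watched - created).total_seconds()        # 35.650129, log: "35.650129518s"
    pull = (last_pull - first_pull).total_seconds()   # 8.501639, image-pull window
    slo  = e2e - pull                                 # 27.148490, log: 27.148490538

    print(f"podStartE2EDuration ~ {e2e:.6f}s, podStartSLOduration ~ {slo:.6f}s")

The same decomposition reproduces the ovn-northd-0, rabbitmq-*, and prometheus-metric-storage-0 entries in this section to within tens of nanoseconds (the tracker appears to difference the monotonic m=+ readings rather than wall-clock values), which is why podStartSLOduration is always the end-to-end duration shrunk by exactly the time spent pulling images.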
kubenswrapper[4772]: I0930 17:19:57.926940 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Sep 30 17:19:58 crc kubenswrapper[4772]: I0930 17:19:58.960148 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.283947 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-lzvcs" Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.457189 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfq9q\" (UniqueName: \"kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q\") pod \"09b10070-728b-4128-8faa-10b567e20342\" (UID: \"09b10070-728b-4128-8faa-10b567e20342\") " Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.467247 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q" (OuterVolumeSpecName: "kube-api-access-hfq9q") pod "09b10070-728b-4128-8faa-10b567e20342" (UID: "09b10070-728b-4128-8faa-10b567e20342"). InnerVolumeSpecName "kube-api-access-hfq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.559487 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfq9q\" (UniqueName: \"kubernetes.io/projected/09b10070-728b-4128-8faa-10b567e20342-kube-api-access-hfq9q\") on node \"crc\" DevicePath \"\"" Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.876901 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-lzvcs" event={"ID":"09b10070-728b-4128-8faa-10b567e20342","Type":"ContainerDied","Data":"170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81"} Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.876961 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170ef72c313133780d3b702faf10ffa2953610a99ab24c7f5bba4e01c7a6be81" Sep 30 17:19:59 crc kubenswrapper[4772]: I0930 17:19:59.876924 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-lzvcs" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.769644 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.769970 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t5kwk" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.774374 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6v6fm" podUID="d66affdf-221c-4a29-a1f7-0c3d7e4d4153" containerName="ovn-controller" probeResult="failure" output=< Sep 30 17:20:00 crc kubenswrapper[4772]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 17:20:00 crc kubenswrapper[4772]: > Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962101 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1272-account-create-42x7w"] Sep 30 17:20:00 crc kubenswrapper[4772]: E0930 17:20:00.962429 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b10070-728b-4128-8faa-10b567e20342" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962447 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b10070-728b-4128-8faa-10b567e20342" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: E0930 17:20:00.962466 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca281cc-c87b-4f62-8d9c-12373a1dc085" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962475 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca281cc-c87b-4f62-8d9c-12373a1dc085" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: E0930 17:20:00.962488 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2815398-98f4-48d9-9e2a-54f25ac3fd0c" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962496 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2815398-98f4-48d9-9e2a-54f25ac3fd0c" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: E0930 17:20:00.962510 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950feb31-d399-4491-a4e4-365371d0d2b6" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962516 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="950feb31-d399-4491-a4e4-365371d0d2b6" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962691 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2815398-98f4-48d9-9e2a-54f25ac3fd0c" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962706 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b10070-728b-4128-8faa-10b567e20342" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962716 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca281cc-c87b-4f62-8d9c-12373a1dc085" containerName="mariadb-database-create" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.962733 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="950feb31-d399-4491-a4e4-365371d0d2b6" containerName="mariadb-database-create" Sep 30 17:20:00 crc 
kubenswrapper[4772]: I0930 17:20:00.963303 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.968968 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 17:20:00 crc kubenswrapper[4772]: I0930 17:20:00.973354 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1272-account-create-42x7w"] Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.017010 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6v6fm-config-z6tl8"] Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.018927 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.023899 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.035164 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm-config-z6tl8"] Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.083284 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjdx\" (UniqueName: \"kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx\") pod \"placement-1272-account-create-42x7w\" (UID: \"346d6dc6-45fc-4534-848a-181c95a3c790\") " pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186099 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186167 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjdx\" (UniqueName: \"kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx\") pod \"placement-1272-account-create-42x7w\" (UID: \"346d6dc6-45fc-4534-848a-181c95a3c790\") " pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186277 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvd2\" (UniqueName: \"kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186337 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.186353 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.207898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjdx\" (UniqueName: \"kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx\") pod \"placement-1272-account-create-42x7w\" (UID: \"346d6dc6-45fc-4534-848a-181c95a3c790\") " pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.279997 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287566 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287601 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: 
\"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.287659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd2\" (UniqueName: \"kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.288888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.288942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.289248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.289631 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.290960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.310182 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvd2\" (UniqueName: \"kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2\") pod \"ovn-controller-6v6fm-config-z6tl8\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.339188 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.340966 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9da5-account-create-dprs7"] Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.341998 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.345986 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.386185 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9da5-account-create-dprs7"] Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.497679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbmkz\" (UniqueName: \"kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz\") pod \"glance-9da5-account-create-dprs7\" (UID: \"4f847bcf-eda9-4d03-8b13-c7688bdeaf31\") " pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.599670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbmkz\" (UniqueName: \"kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz\") pod \"glance-9da5-account-create-dprs7\" (UID: \"4f847bcf-eda9-4d03-8b13-c7688bdeaf31\") " pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.630838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbmkz\" (UniqueName: \"kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz\") pod \"glance-9da5-account-create-dprs7\" (UID: \"4f847bcf-eda9-4d03-8b13-c7688bdeaf31\") " pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.764202 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.834621 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm-config-z6tl8"] Sep 30 17:20:01 crc kubenswrapper[4772]: W0930 17:20:01.843158 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d9ac0f7_6e58_4429_b287_9979084be549.slice/crio-f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622 WatchSource:0}: Error finding container f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622: Status 404 returned error can't find the container with id f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622 Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.862354 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1272-account-create-42x7w"] Sep 30 17:20:01 crc kubenswrapper[4772]: W0930 17:20:01.900023 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346d6dc6_45fc_4534_848a_181c95a3c790.slice/crio-82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f WatchSource:0}: Error finding container 82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f: Status 404 returned error can't find the container with id 82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f Sep 30 17:20:01 crc kubenswrapper[4772]: I0930 17:20:01.913079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-z6tl8" 
event={"ID":"7d9ac0f7-6e58-4429-b287-9979084be549","Type":"ContainerStarted","Data":"f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622"} Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.294737 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9da5-account-create-dprs7"] Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.909693 4772 generic.go:334] "Generic (PLEG): container finished" podID="4f847bcf-eda9-4d03-8b13-c7688bdeaf31" containerID="1c70fcad8c30ec41a32bafc76e313a025ebb1b6e92442c198bafbbf0e5fd5559" exitCode=0 Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.909792 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9da5-account-create-dprs7" event={"ID":"4f847bcf-eda9-4d03-8b13-c7688bdeaf31","Type":"ContainerDied","Data":"1c70fcad8c30ec41a32bafc76e313a025ebb1b6e92442c198bafbbf0e5fd5559"} Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.910039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9da5-account-create-dprs7" event={"ID":"4f847bcf-eda9-4d03-8b13-c7688bdeaf31","Type":"ContainerStarted","Data":"eaa6e372effe419a15b4e27449c5489ce276fa647ed045c2855e242d89639fee"} Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.912379 4772 generic.go:334] "Generic (PLEG): container finished" podID="346d6dc6-45fc-4534-848a-181c95a3c790" containerID="db5afbfcea9ec7c1286711ea88e82dad4d7a7f820076f7a86e2d90353a7fa099" exitCode=0 Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.912406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1272-account-create-42x7w" event={"ID":"346d6dc6-45fc-4534-848a-181c95a3c790","Type":"ContainerDied","Data":"db5afbfcea9ec7c1286711ea88e82dad4d7a7f820076f7a86e2d90353a7fa099"} Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.912433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1272-account-create-42x7w" event={"ID":"346d6dc6-45fc-4534-848a-181c95a3c790","Type":"ContainerStarted","Data":"82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f"} Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.914976 4772 generic.go:334] "Generic (PLEG): container finished" podID="7d9ac0f7-6e58-4429-b287-9979084be549" containerID="120b6fbb659512b040617459d7216ef829d6dfd06f7fdeae10245601f3a3a6c9" exitCode=0 Sep 30 17:20:02 crc kubenswrapper[4772]: I0930 17:20:02.915005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-z6tl8" event={"ID":"7d9ac0f7-6e58-4429-b287-9979084be549","Type":"ContainerDied","Data":"120b6fbb659512b040617459d7216ef829d6dfd06f7fdeae10245601f3a3a6c9"} Sep 30 17:20:03 crc kubenswrapper[4772]: I0930 17:20:03.961631 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:03 crc kubenswrapper[4772]: I0930 17:20:03.964399 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.292512 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.382724 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.390634 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.445616 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbmkz\" (UniqueName: \"kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz\") pod \"4f847bcf-eda9-4d03-8b13-c7688bdeaf31\" (UID: \"4f847bcf-eda9-4d03-8b13-c7688bdeaf31\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.451966 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz" (OuterVolumeSpecName: "kube-api-access-rbmkz") pod "4f847bcf-eda9-4d03-8b13-c7688bdeaf31" (UID: "4f847bcf-eda9-4d03-8b13-c7688bdeaf31"). InnerVolumeSpecName "kube-api-access-rbmkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.546885 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547257 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547348 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjdx\" (UniqueName: \"kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx\") pod \"346d6dc6-45fc-4534-848a-181c95a3c790\" (UID: \"346d6dc6-45fc-4534-848a-181c95a3c790\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547392 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547408 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvd2\" (UniqueName: \"kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547428 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts\") pod \"7d9ac0f7-6e58-4429-b287-9979084be549\" (UID: \"7d9ac0f7-6e58-4429-b287-9979084be549\") " Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547420 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.547554 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run" (OuterVolumeSpecName: "var-run") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548235 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548322 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts" (OuterVolumeSpecName: "scripts") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548794 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548821 4772 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548835 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbmkz\" (UniqueName: \"kubernetes.io/projected/4f847bcf-eda9-4d03-8b13-c7688bdeaf31-kube-api-access-rbmkz\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548846 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548855 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d9ac0f7-6e58-4429-b287-9979084be549-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.548865 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d9ac0f7-6e58-4429-b287-9979084be549-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.550955 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx" (OuterVolumeSpecName: "kube-api-access-cfjdx") pod "346d6dc6-45fc-4534-848a-181c95a3c790" (UID: "346d6dc6-45fc-4534-848a-181c95a3c790"). InnerVolumeSpecName "kube-api-access-cfjdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.551094 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2" (OuterVolumeSpecName: "kube-api-access-sjvd2") pod "7d9ac0f7-6e58-4429-b287-9979084be549" (UID: "7d9ac0f7-6e58-4429-b287-9979084be549"). InnerVolumeSpecName "kube-api-access-sjvd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.649939 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjdx\" (UniqueName: \"kubernetes.io/projected/346d6dc6-45fc-4534-848a-181c95a3c790-kube-api-access-cfjdx\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.650189 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvd2\" (UniqueName: \"kubernetes.io/projected/7d9ac0f7-6e58-4429-b287-9979084be549-kube-api-access-sjvd2\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.932269 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9da5-account-create-dprs7" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.932250 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9da5-account-create-dprs7" event={"ID":"4f847bcf-eda9-4d03-8b13-c7688bdeaf31","Type":"ContainerDied","Data":"eaa6e372effe419a15b4e27449c5489ce276fa647ed045c2855e242d89639fee"} Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.932414 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaa6e372effe419a15b4e27449c5489ce276fa647ed045c2855e242d89639fee" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.934051 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1272-account-create-42x7w" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.934041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1272-account-create-42x7w" event={"ID":"346d6dc6-45fc-4534-848a-181c95a3c790","Type":"ContainerDied","Data":"82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f"} Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.934118 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82005e07761693bde86d6df1998617075ff2bd74b380b888081df8f18beb368f" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.935886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-z6tl8" event={"ID":"7d9ac0f7-6e58-4429-b287-9979084be549","Type":"ContainerDied","Data":"f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622"} Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.935962 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f006f45bfda94a3d4189555eb9b5e2ab73ca82f47258596b018f3fc751e9d622" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.935905 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-z6tl8" Sep 30 17:20:04 crc kubenswrapper[4772]: I0930 17:20:04.937429 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.491588 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6v6fm-config-z6tl8"] Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.500028 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6v6fm-config-z6tl8"] Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.576130 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6v6fm-config-qgcf6"] Sep 30 17:20:05 crc kubenswrapper[4772]: E0930 17:20:05.576673 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346d6dc6-45fc-4534-848a-181c95a3c790" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.576694 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="346d6dc6-45fc-4534-848a-181c95a3c790" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: E0930 17:20:05.576708 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9ac0f7-6e58-4429-b287-9979084be549" containerName="ovn-config" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.576716 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9ac0f7-6e58-4429-b287-9979084be549" containerName="ovn-config" Sep 30 17:20:05 crc kubenswrapper[4772]: E0930 17:20:05.576819 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f847bcf-eda9-4d03-8b13-c7688bdeaf31" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.576826 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f847bcf-eda9-4d03-8b13-c7688bdeaf31" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.576988 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9ac0f7-6e58-4429-b287-9979084be549" containerName="ovn-config" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.577005 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="346d6dc6-45fc-4534-848a-181c95a3c790" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.577011 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f847bcf-eda9-4d03-8b13-c7688bdeaf31" containerName="mariadb-account-create" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.577603 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.583263 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.586077 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm-config-qgcf6"] Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.686996 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.687111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.687167 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.687219 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.687237 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.687255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslzg\" (UniqueName: \"kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.766418 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6v6fm" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788548 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788563 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslzg\" (UniqueName: \"kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788674 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788716 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788794 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.788943 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.789013 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.789238 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.791198 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.823145 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslzg\" (UniqueName: \"kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg\") pod \"ovn-controller-6v6fm-config-qgcf6\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.895224 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:05 crc kubenswrapper[4772]: I0930 17:20:05.918434 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9ac0f7-6e58-4429-b287-9979084be549" path="/var/lib/kubelet/pods/7d9ac0f7-6e58-4429-b287-9979084be549/volumes" Sep 30 17:20:06 crc kubenswrapper[4772]: W0930 17:20:06.366386 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca84a615_c9dc_48f6_aa66_029bd11ccfaa.slice/crio-ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd WatchSource:0}: Error finding container ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd: Status 404 returned error can't find the container with id ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.372086 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6v6fm-config-qgcf6"] Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.414327 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pwm7r"] Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.420527 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.424299 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.424809 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgs5q" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.456276 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pwm7r"] Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.600357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.600415 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.600527 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.600648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzz9r\" (UniqueName: \"kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.702784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.703141 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.703209 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.703275 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzz9r\" (UniqueName: \"kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r\") pod 
\"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.710290 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.723764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.756739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.763688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzz9r\" (UniqueName: \"kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r\") pod \"glance-db-sync-pwm7r\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.780650 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pwm7r" Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.961188 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-qgcf6" event={"ID":"ca84a615-c9dc-48f6-aa66-029bd11ccfaa","Type":"ContainerStarted","Data":"4a8b1a3f5e9f1ea0ffcdbdf0e8eef9da02c973832222958146e309f7e5ab7471"} Sep 30 17:20:06 crc kubenswrapper[4772]: I0930 17:20:06.961521 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-qgcf6" event={"ID":"ca84a615-c9dc-48f6-aa66-029bd11ccfaa","Type":"ContainerStarted","Data":"ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd"} Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.207003 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="607217cf-8f90-4adb-bca7-0271ea8a7b9b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.258266 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6v6fm-config-qgcf6" podStartSLOduration=2.258243523 podStartE2EDuration="2.258243523s" podCreationTimestamp="2025-09-30 17:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.001398851 +0000 UTC m=+1107.908411682" watchObservedRunningTime="2025-09-30 17:20:07.258243523 +0000 UTC m=+1108.165256354" Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.263020 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pwm7r"] Sep 30 17:20:07 crc kubenswrapper[4772]: W0930 17:20:07.268315 4772 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod523a44fe_7e63_47a7_9b9d_4e272994dce1.slice/crio-dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05 WatchSource:0}: Error finding container dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05: Status 404 returned error can't find the container with id dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.549359 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.574676 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.574944 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="prometheus" containerID="cri-o://cbc3dae9d8b7c7fd7423aa2326587afe27a88c29c1d6e316093f3e5aba0c5c5a" gracePeriod=600 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.575009 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="thanos-sidecar" containerID="cri-o://b86b85d7a96ffbf9e67a24c72cad1af56de311652111c2311216f7bf53f76f91" gracePeriod=600 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.575029 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="config-reloader" containerID="cri-o://3a8bc55220d97adb7aaaa3f003c5f559b4015838c40cd82b1631dcd48c2c280e" gracePeriod=600 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.926424 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.984948 4772 generic.go:334] "Generic (PLEG): container finished" podID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerID="b86b85d7a96ffbf9e67a24c72cad1af56de311652111c2311216f7bf53f76f91" exitCode=0 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.985004 4772 generic.go:334] "Generic (PLEG): container finished" podID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerID="3a8bc55220d97adb7aaaa3f003c5f559b4015838c40cd82b1631dcd48c2c280e" exitCode=0 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.985011 4772 generic.go:334] "Generic (PLEG): container finished" podID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerID="cbc3dae9d8b7c7fd7423aa2326587afe27a88c29c1d6e316093f3e5aba0c5c5a" exitCode=0 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.985020 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerDied","Data":"b86b85d7a96ffbf9e67a24c72cad1af56de311652111c2311216f7bf53f76f91"} Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.985082 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerDied","Data":"3a8bc55220d97adb7aaaa3f003c5f559b4015838c40cd82b1631dcd48c2c280e"} Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.985096 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerDied","Data":"cbc3dae9d8b7c7fd7423aa2326587afe27a88c29c1d6e316093f3e5aba0c5c5a"} Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.986580 4772 generic.go:334] "Generic (PLEG): container finished" podID="ca84a615-c9dc-48f6-aa66-029bd11ccfaa" containerID="4a8b1a3f5e9f1ea0ffcdbdf0e8eef9da02c973832222958146e309f7e5ab7471" exitCode=0 Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.986636 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-qgcf6" event={"ID":"ca84a615-c9dc-48f6-aa66-029bd11ccfaa","Type":"ContainerDied","Data":"4a8b1a3f5e9f1ea0ffcdbdf0e8eef9da02c973832222958146e309f7e5ab7471"} Sep 30 17:20:07 crc kubenswrapper[4772]: I0930 17:20:07.987920 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwm7r" event={"ID":"523a44fe-7e63-47a7-9b9d-4e272994dce1","Type":"ContainerStarted","Data":"dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05"} Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.536597 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636427 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636476 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636525 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636590 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636665 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9rdk\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.636787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\" (UID: \"3d947ffa-5613-4aae-b4a9-d42094fad0ae\") " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.638362 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.643132 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out" (OuterVolumeSpecName: "config-out") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.643505 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk" (OuterVolumeSpecName: "kube-api-access-s9rdk") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "kube-api-access-s9rdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.643776 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config" (OuterVolumeSpecName: "config") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.643915 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.647675 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.655210 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.667624 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config" (OuterVolumeSpecName: "web-config") pod "3d947ffa-5613-4aae-b4a9-d42094fad0ae" (UID: "3d947ffa-5613-4aae-b4a9-d42094fad0ae"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739250 4772 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739287 4772 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739299 4772 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3d947ffa-5613-4aae-b4a9-d42094fad0ae-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739309 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739322 4772 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739333 4772 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3d947ffa-5613-4aae-b4a9-d42094fad0ae-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739344 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9rdk\" (UniqueName: \"kubernetes.io/projected/3d947ffa-5613-4aae-b4a9-d42094fad0ae-kube-api-access-s9rdk\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.739387 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") on node \"crc\" " Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.758539 4772 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.758687 4772 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55") on node "crc" Sep 30 17:20:08 crc kubenswrapper[4772]: I0930 17:20:08.840835 4772 reconciler_common.go:293] "Volume detached for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:08.999940 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.001252 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3d947ffa-5613-4aae-b4a9-d42094fad0ae","Type":"ContainerDied","Data":"0298d8ba26cdd55502c9bd6622adb0ee0b52c7f255c4f2a6017ca15bab0ec9aa"} Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.001323 4772 scope.go:117] "RemoveContainer" containerID="b86b85d7a96ffbf9e67a24c72cad1af56de311652111c2311216f7bf53f76f91" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.026976 4772 scope.go:117] "RemoveContainer" containerID="3a8bc55220d97adb7aaaa3f003c5f559b4015838c40cd82b1631dcd48c2c280e" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.040460 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.049209 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.058967 4772 scope.go:117] "RemoveContainer" containerID="cbc3dae9d8b7c7fd7423aa2326587afe27a88c29c1d6e316093f3e5aba0c5c5a" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.071724 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:09 crc kubenswrapper[4772]: E0930 17:20:09.072195 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="init-config-reloader" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072218 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="init-config-reloader" Sep 30 17:20:09 crc kubenswrapper[4772]: E0930 17:20:09.072235 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="thanos-sidecar" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072243 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="thanos-sidecar" Sep 30 17:20:09 crc kubenswrapper[4772]: E0930 17:20:09.072264 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="prometheus" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072272 4772 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="prometheus" Sep 30 17:20:09 crc kubenswrapper[4772]: E0930 17:20:09.072287 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="config-reloader" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072295 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="config-reloader" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072485 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="thanos-sidecar" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072511 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="config-reloader" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.072529 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" containerName="prometheus" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.074326 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.079598 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.079788 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-phjm5" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.080144 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.080973 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.087136 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.088783 4772 scope.go:117] "RemoveContainer" containerID="880847e146733e46ee15523008c4cce1978586e9b6c0a83798cbe8f338b9a176" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.090381 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.091334 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.104498 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.248825 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.248890 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpxf\" (UniqueName: 
\"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.248925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.248971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249091 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249307 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249387 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249783 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.249949 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351042 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351445 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpxf\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351593 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351668 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351746 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.351784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.352702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.356568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.357165 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.357464 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
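Both CSI skips in this section (attacher.UnmountDevice earlier, attacher.MountDevice here) have the same cause: the kubevirt.io.hostpath-provisioner driver does not advertise the STAGE_UNSTAGE_VOLUME node capability, so kubelet bypasses the staging step and only records the global mount path, shown in the MountVolume.MountDevice entry just below. The hash segment of that path looks like a digest of the volume handle; that naming rule is an assumption, not something these entries state, but it is cheap to test against the values in this log:

    import hashlib

    # Values copied from the MountDevice entry below; the SHA-256 naming rule
    # itself is a hypothesis being checked, not documented behavior.
    handle   = "pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55"
    observed = "a4cd7d25308c8c5d6d110405c655d59b160fe777a0b1c5faa198b785c403f1cc"

    print(hashlib.sha256(handle.encode()).hexdigest() == observed)
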
Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.357528 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a4cd7d25308c8c5d6d110405c655d59b160fe777a0b1c5faa198b785c403f1cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.358839 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.360150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.360810 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.361365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.361646 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.363107 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.365633 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.391418 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpxf\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.418351 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.455842 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.455943 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pslzg\" (UniqueName: \"kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.456223 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.456269 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.456292 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.456331 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts\") pod \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\" (UID: \"ca84a615-c9dc-48f6-aa66-029bd11ccfaa\") " Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.456942 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.457132 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts" (OuterVolumeSpecName: "scripts") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.457165 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.457205 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run" (OuterVolumeSpecName: "var-run") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.457735 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.461651 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg" (OuterVolumeSpecName: "kube-api-access-pslzg") pod "ca84a615-c9dc-48f6-aa66-029bd11ccfaa" (UID: "ca84a615-c9dc-48f6-aa66-029bd11ccfaa"). InnerVolumeSpecName "kube-api-access-pslzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558683 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pslzg\" (UniqueName: \"kubernetes.io/projected/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-kube-api-access-pslzg\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558725 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558740 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558752 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558763 4772 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.558776 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca84a615-c9dc-48f6-aa66-029bd11ccfaa-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.690891 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:09 crc kubenswrapper[4772]: I0930 17:20:09.940032 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d947ffa-5613-4aae-b4a9-d42094fad0ae" path="/var/lib/kubelet/pods/3d947ffa-5613-4aae-b4a9-d42094fad0ae/volumes" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.012176 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6v6fm-config-qgcf6" event={"ID":"ca84a615-c9dc-48f6-aa66-029bd11ccfaa","Type":"ContainerDied","Data":"ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd"} Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.012219 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0a9d269dd0c372a3041fbaacb002ac09722abfc330107c858ce9222dbc07cd" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.012247 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6v6fm-config-qgcf6" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.060200 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6v6fm-config-qgcf6"] Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.066164 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6v6fm-config-qgcf6"] Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.121782 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.620590 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bad6-account-create-ns8k7"] Sep 30 17:20:10 crc kubenswrapper[4772]: E0930 17:20:10.621299 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca84a615-c9dc-48f6-aa66-029bd11ccfaa" containerName="ovn-config" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.621320 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca84a615-c9dc-48f6-aa66-029bd11ccfaa" containerName="ovn-config" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.621540 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca84a615-c9dc-48f6-aa66-029bd11ccfaa" containerName="ovn-config" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.622101 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.624100 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.631605 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bad6-account-create-ns8k7"] Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.782528 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thd6q\" (UniqueName: \"kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q\") pod \"keystone-bad6-account-create-ns8k7\" (UID: \"aabbe70a-9741-4424-ae33-ed237e64a54b\") " pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.883943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thd6q\" (UniqueName: \"kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q\") pod \"keystone-bad6-account-create-ns8k7\" (UID: \"aabbe70a-9741-4424-ae33-ed237e64a54b\") " pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.916068 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thd6q\" (UniqueName: \"kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q\") pod \"keystone-bad6-account-create-ns8k7\" (UID: \"aabbe70a-9741-4424-ae33-ed237e64a54b\") " pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:10 crc kubenswrapper[4772]: I0930 17:20:10.942113 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:11 crc kubenswrapper[4772]: I0930 17:20:11.023005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerStarted","Data":"5ee4cfeda89ce98d4146d01afc038d51fa93ce363bc9c7872d4b719b05da6a61"} Sep 30 17:20:11 crc kubenswrapper[4772]: I0930 17:20:11.345446 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bad6-account-create-ns8k7"] Sep 30 17:20:11 crc kubenswrapper[4772]: W0930 17:20:11.370768 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaabbe70a_9741_4424_ae33_ed237e64a54b.slice/crio-fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843 WatchSource:0}: Error finding container fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843: Status 404 returned error can't find the container with id fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843 Sep 30 17:20:11 crc kubenswrapper[4772]: I0930 17:20:11.912801 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca84a615-c9dc-48f6-aa66-029bd11ccfaa" path="/var/lib/kubelet/pods/ca84a615-c9dc-48f6-aa66-029bd11ccfaa/volumes" Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.034928 4772 generic.go:334] "Generic (PLEG): container finished" podID="aabbe70a-9741-4424-ae33-ed237e64a54b" containerID="d4a1824ba74425c91844929323df348b7f9ff9219801ae6d6ff31d0f83f64888" exitCode=0 Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.035003 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bad6-account-create-ns8k7" event={"ID":"aabbe70a-9741-4424-ae33-ed237e64a54b","Type":"ContainerDied","Data":"d4a1824ba74425c91844929323df348b7f9ff9219801ae6d6ff31d0f83f64888"} Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.035036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bad6-account-create-ns8k7" event={"ID":"aabbe70a-9741-4424-ae33-ed237e64a54b","Type":"ContainerStarted","Data":"fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843"} Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.784275 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-3d9f-account-create-84lzl"] Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.786568 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.788601 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.798424 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-3d9f-account-create-84lzl"] Sep 30 17:20:12 crc kubenswrapper[4772]: I0930 17:20:12.924601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgkd\" (UniqueName: \"kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd\") pod \"watcher-3d9f-account-create-84lzl\" (UID: \"143dbcfd-e0db-419c-aa5e-65ad9174de1e\") " pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.026199 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgkd\" (UniqueName: \"kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd\") pod \"watcher-3d9f-account-create-84lzl\" (UID: \"143dbcfd-e0db-419c-aa5e-65ad9174de1e\") " pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.056784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgkd\" (UniqueName: \"kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd\") pod \"watcher-3d9f-account-create-84lzl\" (UID: \"143dbcfd-e0db-419c-aa5e-65ad9174de1e\") " pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.109184 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.423061 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.533261 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thd6q\" (UniqueName: \"kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q\") pod \"aabbe70a-9741-4424-ae33-ed237e64a54b\" (UID: \"aabbe70a-9741-4424-ae33-ed237e64a54b\") " Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.540821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q" (OuterVolumeSpecName: "kube-api-access-thd6q") pod "aabbe70a-9741-4424-ae33-ed237e64a54b" (UID: "aabbe70a-9741-4424-ae33-ed237e64a54b"). InnerVolumeSpecName "kube-api-access-thd6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.600532 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-3d9f-account-create-84lzl"] Sep 30 17:20:13 crc kubenswrapper[4772]: W0930 17:20:13.605216 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143dbcfd_e0db_419c_aa5e_65ad9174de1e.slice/crio-05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420 WatchSource:0}: Error finding container 05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420: Status 404 returned error can't find the container with id 05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420 Sep 30 17:20:13 crc kubenswrapper[4772]: I0930 17:20:13.636004 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thd6q\" (UniqueName: \"kubernetes.io/projected/aabbe70a-9741-4424-ae33-ed237e64a54b-kube-api-access-thd6q\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.064660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bad6-account-create-ns8k7" event={"ID":"aabbe70a-9741-4424-ae33-ed237e64a54b","Type":"ContainerDied","Data":"fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843"} Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.065001 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd5dd96cfd247334cb88465a42ba7aecff221b9e07c68791ee8084666080f843" Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.064701 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bad6-account-create-ns8k7" Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.066942 4772 generic.go:334] "Generic (PLEG): container finished" podID="143dbcfd-e0db-419c-aa5e-65ad9174de1e" containerID="8c859f941a1598f3fc91c4830640df9fcbc78c81830b9b5e7f3ac0bba7f846f1" exitCode=0 Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.067023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-3d9f-account-create-84lzl" event={"ID":"143dbcfd-e0db-419c-aa5e-65ad9174de1e","Type":"ContainerDied","Data":"8c859f941a1598f3fc91c4830640df9fcbc78c81830b9b5e7f3ac0bba7f846f1"} Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.067068 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-3d9f-account-create-84lzl" event={"ID":"143dbcfd-e0db-419c-aa5e-65ad9174de1e","Type":"ContainerStarted","Data":"05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420"} Sep 30 17:20:14 crc kubenswrapper[4772]: I0930 17:20:14.068802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerStarted","Data":"2f911176d380f31af114254402121761cf351d24a2ee740826e7f2c17aaf9f09"} Sep 30 17:20:17 crc kubenswrapper[4772]: I0930 17:20:17.210282 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Sep 30 17:20:17 crc kubenswrapper[4772]: I0930 17:20:17.551258 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 17:20:17 crc kubenswrapper[4772]: I0930 17:20:17.933354 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:20:18 crc 
kubenswrapper[4772]: I0930 17:20:18.711976 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8jz82"] Sep 30 17:20:18 crc kubenswrapper[4772]: E0930 17:20:18.713281 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabbe70a-9741-4424-ae33-ed237e64a54b" containerName="mariadb-account-create" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.713389 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabbe70a-9741-4424-ae33-ed237e64a54b" containerName="mariadb-account-create" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.713667 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabbe70a-9741-4424-ae33-ed237e64a54b" containerName="mariadb-account-create" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.714406 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.740735 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8jz82"] Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.826872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8qj\" (UniqueName: \"kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj\") pod \"cinder-db-create-8jz82\" (UID: \"1ffc8f04-1cf1-4653-925c-781c84770099\") " pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.898671 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bpsgs"] Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.900175 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.916578 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bpsgs"] Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.948987 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8qj\" (UniqueName: \"kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj\") pod \"cinder-db-create-8jz82\" (UID: \"1ffc8f04-1cf1-4653-925c-781c84770099\") " pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.979415 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8qj\" (UniqueName: \"kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj\") pod \"cinder-db-create-8jz82\" (UID: \"1ffc8f04-1cf1-4653-925c-781c84770099\") " pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:18 crc kubenswrapper[4772]: I0930 17:20:18.998496 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-76fwc"] Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.000627 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.009662 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-76fwc"] Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.034505 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.058810 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xfm\" (UniqueName: \"kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm\") pod \"neutron-db-create-76fwc\" (UID: \"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44\") " pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.059046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxwc\" (UniqueName: \"kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc\") pod \"barbican-db-create-bpsgs\" (UID: \"d32d405b-efef-42de-a2c0-7c047dbcbec3\") " pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.064662 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tmd5b"] Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.065721 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.069160 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fsmbb" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.069532 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.069695 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.069842 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.072577 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tmd5b"] Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.160406 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcnw\" (UniqueName: \"kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.160715 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxwc\" (UniqueName: \"kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc\") pod \"barbican-db-create-bpsgs\" (UID: \"d32d405b-efef-42de-a2c0-7c047dbcbec3\") " pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.160858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.160988 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xfm\" (UniqueName: \"kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm\") pod \"neutron-db-create-76fwc\" (UID: 
\"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44\") " pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.161141 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.180514 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxwc\" (UniqueName: \"kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc\") pod \"barbican-db-create-bpsgs\" (UID: \"d32d405b-efef-42de-a2c0-7c047dbcbec3\") " pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.182927 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xfm\" (UniqueName: \"kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm\") pod \"neutron-db-create-76fwc\" (UID: \"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44\") " pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.253904 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.262910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.263043 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.263137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcnw\" (UniqueName: \"kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.267010 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.269156 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.283642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcnw\" (UniqueName: 
\"kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw\") pod \"keystone-db-sync-tmd5b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.338288 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:19 crc kubenswrapper[4772]: I0930 17:20:19.386496 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:22 crc kubenswrapper[4772]: I0930 17:20:22.148163 4772 generic.go:334] "Generic (PLEG): container finished" podID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerID="2f911176d380f31af114254402121761cf351d24a2ee740826e7f2c17aaf9f09" exitCode=0 Sep 30 17:20:22 crc kubenswrapper[4772]: I0930 17:20:22.148246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerDied","Data":"2f911176d380f31af114254402121761cf351d24a2ee740826e7f2c17aaf9f09"} Sep 30 17:20:23 crc kubenswrapper[4772]: E0930 17:20:23.499536 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Sep 30 17:20:23 crc kubenswrapper[4772]: E0930 17:20:23.499859 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Sep 30 17:20:23 crc kubenswrapper[4772]: E0930 17:20:23.499976 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.129.56.221:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzz9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-pwm7r_openstack(523a44fe-7e63-47a7-9b9d-4e272994dce1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:20:23 crc kubenswrapper[4772]: E0930 17:20:23.501136 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-pwm7r" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" Sep 30 17:20:23 crc kubenswrapper[4772]: I0930 17:20:23.651143 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:23 crc kubenswrapper[4772]: I0930 17:20:23.758867 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bgkd\" (UniqueName: \"kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd\") pod \"143dbcfd-e0db-419c-aa5e-65ad9174de1e\" (UID: \"143dbcfd-e0db-419c-aa5e-65ad9174de1e\") " Sep 30 17:20:23 crc kubenswrapper[4772]: I0930 17:20:23.767573 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd" (OuterVolumeSpecName: "kube-api-access-2bgkd") pod "143dbcfd-e0db-419c-aa5e-65ad9174de1e" (UID: "143dbcfd-e0db-419c-aa5e-65ad9174de1e"). InnerVolumeSpecName "kube-api-access-2bgkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:23 crc kubenswrapper[4772]: I0930 17:20:23.861786 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bgkd\" (UniqueName: \"kubernetes.io/projected/143dbcfd-e0db-419c-aa5e-65ad9174de1e-kube-api-access-2bgkd\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:23 crc kubenswrapper[4772]: W0930 17:20:23.959820 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8344d58c_29c7_40c3_81bb_5fc3ad4ea02b.slice/crio-718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393 WatchSource:0}: Error finding container 718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393: Status 404 returned error can't find the container with id 718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393 Sep 30 17:20:23 crc kubenswrapper[4772]: I0930 17:20:23.960636 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tmd5b"] Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.056327 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bpsgs"] Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.062598 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8jz82"] Sep 30 17:20:24 crc kubenswrapper[4772]: W0930 17:20:24.064839 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ffc8f04_1cf1_4653_925c_781c84770099.slice/crio-633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3 WatchSource:0}: Error finding container 633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3: Status 404 returned error can't find the container with id 633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3 Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.069434 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-76fwc"] Sep 30 17:20:24 crc kubenswrapper[4772]: W0930 17:20:24.071256 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6cb11fd_200b_44d7_a29f_f9ebbdbc2d44.slice/crio-96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398 WatchSource:0}: Error finding container 96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398: Status 404 returned error can't find the container with id 96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398 Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.183220 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8jz82" event={"ID":"1ffc8f04-1cf1-4653-925c-781c84770099","Type":"ContainerStarted","Data":"633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3"} Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.186041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerStarted","Data":"fa8fcb56ba464764db6ffb6fc836f13e17f54d29518d27ea98e40538d0cc74b2"} Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.188089 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bpsgs" event={"ID":"d32d405b-efef-42de-a2c0-7c047dbcbec3","Type":"ContainerStarted","Data":"9488d433b5fda167ba70332ee9709707421c443d28979f0fc4f8e8ed97918857"} Sep 30 17:20:24 crc 
kubenswrapper[4772]: I0930 17:20:24.189453 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tmd5b" event={"ID":"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b","Type":"ContainerStarted","Data":"718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393"} Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.190474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-76fwc" event={"ID":"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44","Type":"ContainerStarted","Data":"96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398"} Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.191948 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-3d9f-account-create-84lzl" Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.191953 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-3d9f-account-create-84lzl" event={"ID":"143dbcfd-e0db-419c-aa5e-65ad9174de1e","Type":"ContainerDied","Data":"05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420"} Sep 30 17:20:24 crc kubenswrapper[4772]: I0930 17:20:24.192008 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05969fe31ca7c3ae6e5e8f0951b99d430ea93ac77d10c3bd31a34760d7595420" Sep 30 17:20:24 crc kubenswrapper[4772]: E0930 17:20:24.195987 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.221:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-pwm7r" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.203804 4772 generic.go:334] "Generic (PLEG): container finished" podID="a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" containerID="85ffc3c53bfbef4f015e4511158ceb4aecc00e91d8d6288ff39c66ccab4fa0cb" exitCode=0 Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.204096 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-76fwc" event={"ID":"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44","Type":"ContainerDied","Data":"85ffc3c53bfbef4f015e4511158ceb4aecc00e91d8d6288ff39c66ccab4fa0cb"} Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.205701 4772 generic.go:334] "Generic (PLEG): container finished" podID="1ffc8f04-1cf1-4653-925c-781c84770099" containerID="91c2f294268261e241aac7972a7c3e2bf679dcd6facbec1710878bddc0a11327" exitCode=0 Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.205742 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8jz82" event={"ID":"1ffc8f04-1cf1-4653-925c-781c84770099","Type":"ContainerDied","Data":"91c2f294268261e241aac7972a7c3e2bf679dcd6facbec1710878bddc0a11327"} Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.208532 4772 generic.go:334] "Generic (PLEG): container finished" podID="d32d405b-efef-42de-a2c0-7c047dbcbec3" containerID="b2c401b202da2439070bae59cba89ede435518cde20bf7c02161c203f3b055d0" exitCode=0 Sep 30 17:20:25 crc kubenswrapper[4772]: I0930 17:20:25.208565 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bpsgs" event={"ID":"d32d405b-efef-42de-a2c0-7c047dbcbec3","Type":"ContainerDied","Data":"b2c401b202da2439070bae59cba89ede435518cde20bf7c02161c203f3b055d0"} Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.238694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerStarted","Data":"16e58bdbd634a993e4df41c064b6189911fd179943f9d3088eef4d2150d5f963"} Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.243091 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-pf2kh"] Sep 30 17:20:32 crc kubenswrapper[4772]: E0930 17:20:28.243516 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143dbcfd-e0db-419c-aa5e-65ad9174de1e" containerName="mariadb-account-create" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.243532 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="143dbcfd-e0db-419c-aa5e-65ad9174de1e" containerName="mariadb-account-create" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.243740 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="143dbcfd-e0db-419c-aa5e-65ad9174de1e" containerName="mariadb-account-create" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.244481 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.247656 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5kwnw" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.257782 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.258757 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-pf2kh"] Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.339013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.339092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.339406 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdrc2\" (UniqueName: \"kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.339461 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.441525 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: 
\"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.441612 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.441651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.441726 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdrc2\" (UniqueName: \"kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.453772 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.453930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.453990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.461763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdrc2\" (UniqueName: \"kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2\") pod \"watcher-db-sync-pf2kh\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:28.569965 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:31.807686 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:31.900669 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8xfm\" (UniqueName: \"kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm\") pod \"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44\" (UID: \"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44\") " Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:31.907212 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm" (OuterVolumeSpecName: "kube-api-access-d8xfm") pod "a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" (UID: "a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44"). InnerVolumeSpecName "kube-api-access-d8xfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.002415 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8xfm\" (UniqueName: \"kubernetes.io/projected/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44-kube-api-access-d8xfm\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.275909 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-76fwc" event={"ID":"a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44","Type":"ContainerDied","Data":"96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398"} Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.275944 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96c272a956e6a6c8962e5f593d36aa9f898986f47c3ccd4abd97b0afeb621398" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.276011 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-76fwc" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.280375 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerStarted","Data":"ee97b6f54d1fdcfcf030810826727b7473e4082a10284499b1dc90a1ea2af6a8"} Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.290701 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tmd5b" event={"ID":"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b","Type":"ContainerStarted","Data":"3b7fa39dfb0ec84ac5f5eee196c7d64ca902556cb1cc336ffff2d5fd35a6ff10"} Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.392409 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.392384949 podStartE2EDuration="23.392384949s" podCreationTimestamp="2025-09-30 17:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:32.368475708 +0000 UTC m=+1133.275488539" watchObservedRunningTime="2025-09-30 17:20:32.392384949 +0000 UTC m=+1133.299397780" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.403392 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tmd5b" podStartSLOduration=5.688049423 podStartE2EDuration="13.40336767s" podCreationTimestamp="2025-09-30 17:20:19 +0000 UTC" firstStartedPulling="2025-09-30 17:20:23.962314488 +0000 UTC m=+1124.869327319" lastFinishedPulling="2025-09-30 17:20:31.677632735 +0000 UTC m=+1132.584645566" observedRunningTime="2025-09-30 17:20:32.387443883 +0000 UTC m=+1133.294456724" watchObservedRunningTime="2025-09-30 17:20:32.40336767 +0000 UTC m=+1133.310380501" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.412752 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.416105 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.444990 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-pf2kh"] Sep 30 17:20:32 crc kubenswrapper[4772]: W0930 17:20:32.445524 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e98d64c_5c8f_4fcb_a5a9_315cd49ab5d7.slice/crio-f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43 WatchSource:0}: Error finding container f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43: Status 404 returned error can't find the container with id f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43 Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.517307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxwc\" (UniqueName: \"kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc\") pod \"d32d405b-efef-42de-a2c0-7c047dbcbec3\" (UID: \"d32d405b-efef-42de-a2c0-7c047dbcbec3\") " Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.517448 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d8qj\" (UniqueName: \"kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj\") pod \"1ffc8f04-1cf1-4653-925c-781c84770099\" (UID: \"1ffc8f04-1cf1-4653-925c-781c84770099\") " Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.523186 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj" (OuterVolumeSpecName: "kube-api-access-6d8qj") pod "1ffc8f04-1cf1-4653-925c-781c84770099" (UID: "1ffc8f04-1cf1-4653-925c-781c84770099"). InnerVolumeSpecName "kube-api-access-6d8qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.526671 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc" (OuterVolumeSpecName: "kube-api-access-6qxwc") pod "d32d405b-efef-42de-a2c0-7c047dbcbec3" (UID: "d32d405b-efef-42de-a2c0-7c047dbcbec3"). InnerVolumeSpecName "kube-api-access-6qxwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.620680 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d8qj\" (UniqueName: \"kubernetes.io/projected/1ffc8f04-1cf1-4653-925c-781c84770099-kube-api-access-6d8qj\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:32 crc kubenswrapper[4772]: I0930 17:20:32.620724 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxwc\" (UniqueName: \"kubernetes.io/projected/d32d405b-efef-42de-a2c0-7c047dbcbec3-kube-api-access-6qxwc\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.299728 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-pf2kh" event={"ID":"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7","Type":"ContainerStarted","Data":"f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43"} Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.302986 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8jz82" Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.302977 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8jz82" event={"ID":"1ffc8f04-1cf1-4653-925c-781c84770099","Type":"ContainerDied","Data":"633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3"} Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.303123 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="633ab88a34db6924dd44ad5fc08af605fb39fe98619bf2d0ef68958828a519d3" Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.304787 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bpsgs" event={"ID":"d32d405b-efef-42de-a2c0-7c047dbcbec3","Type":"ContainerDied","Data":"9488d433b5fda167ba70332ee9709707421c443d28979f0fc4f8e8ed97918857"} Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.304828 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9488d433b5fda167ba70332ee9709707421c443d28979f0fc4f8e8ed97918857" Sep 30 17:20:33 crc kubenswrapper[4772]: I0930 17:20:33.304841 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bpsgs" Sep 30 17:20:34 crc kubenswrapper[4772]: I0930 17:20:34.691424 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:36 crc kubenswrapper[4772]: I0930 17:20:36.335415 4772 generic.go:334] "Generic (PLEG): container finished" podID="8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" containerID="3b7fa39dfb0ec84ac5f5eee196c7d64ca902556cb1cc336ffff2d5fd35a6ff10" exitCode=0 Sep 30 17:20:36 crc kubenswrapper[4772]: I0930 17:20:36.335791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tmd5b" event={"ID":"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b","Type":"ContainerDied","Data":"3b7fa39dfb0ec84ac5f5eee196c7d64ca902556cb1cc336ffff2d5fd35a6ff10"} Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.135546 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.231818 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwcnw\" (UniqueName: \"kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw\") pod \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.232151 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data\") pod \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.232344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle\") pod \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\" (UID: \"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b\") " Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.240278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw" (OuterVolumeSpecName: "kube-api-access-kwcnw") pod "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" (UID: "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b"). InnerVolumeSpecName "kube-api-access-kwcnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.263435 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" (UID: "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.282378 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data" (OuterVolumeSpecName: "config-data") pod "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" (UID: "8344d58c-29c7-40c3-81bb-5fc3ad4ea02b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.333986 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.334019 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwcnw\" (UniqueName: \"kubernetes.io/projected/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-kube-api-access-kwcnw\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.334031 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.352208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tmd5b" event={"ID":"8344d58c-29c7-40c3-81bb-5fc3ad4ea02b","Type":"ContainerDied","Data":"718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393"} Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.352249 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="718a4c75b3f73c9e8b307f102f3837fb64ab337137dae12ed7de4a93a20eb393" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.352264 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tmd5b" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.611907 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mgrkk"] Sep 30 17:20:38 crc kubenswrapper[4772]: E0930 17:20:38.612758 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ffc8f04-1cf1-4653-925c-781c84770099" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.612777 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffc8f04-1cf1-4653-925c-781c84770099" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: E0930 17:20:38.612789 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" containerName="keystone-db-sync" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.612796 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" containerName="keystone-db-sync" Sep 30 17:20:38 crc kubenswrapper[4772]: E0930 17:20:38.612822 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.612828 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: E0930 17:20:38.612846 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32d405b-efef-42de-a2c0-7c047dbcbec3" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.612899 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32d405b-efef-42de-a2c0-7c047dbcbec3" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.613297 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d32d405b-efef-42de-a2c0-7c047dbcbec3" 
containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.613317 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.613328 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" containerName="keystone-db-sync" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.613348 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ffc8f04-1cf1-4653-925c-781c84770099" containerName="mariadb-database-create" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.613888 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.619581 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fsmbb" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.619881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.620081 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.620630 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.629107 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mgrkk"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.638915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.638976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.639040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.639126 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.639157 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle\") pod \"keystone-bootstrap-mgrkk\" (UID: 
\"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.639197 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzr7x\" (UniqueName: \"kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.650707 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.652228 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.657279 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.657344 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.668633 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741622 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741706 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" 
(UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741737 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vxvr\" (UniqueName: \"kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741822 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741846 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.741883 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzr7x\" (UniqueName: \"kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.746050 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.751116 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.758779 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " 
pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.767464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.769738 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzr7x\" (UniqueName: \"kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.771945 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys\") pod \"keystone-bootstrap-mgrkk\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.813849 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.816004 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.826607 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.831617 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843396 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843501 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843526 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843543 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4r2v\" (UniqueName: 
\"kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843566 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843589 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843622 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843733 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.843769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vxvr\" (UniqueName: \"kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.845186 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.847864 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " 
pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.848363 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.852001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.853526 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.871482 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vxvr\" (UniqueName: \"kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr\") pod \"dnsmasq-dns-6f68f479bf-6x75v\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.913885 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a3ad-account-create-tv29q"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.915253 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.917882 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.936703 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.948033 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a3ad-account-create-tv29q"] Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950095 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4r2v\" (UniqueName: \"kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950163 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950198 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950326 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdj2\" (UniqueName: \"kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2\") pod \"barbican-a3ad-account-create-tv29q\" (UID: \"0c6026a4-443b-44c1-8390-d1958c7b4f92\") " pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.950945 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.959100 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.960838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.961159 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.963575 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.968899 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.981359 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4r2v\" (UniqueName: \"kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v\") pod \"ceilometer-0\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " pod="openstack/ceilometer-0" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.985479 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:38 crc kubenswrapper[4772]: I0930 17:20:38.985983 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-464a-account-create-r86r7"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.008157 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.010573 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-464a-account-create-r86r7"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.016349 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.049143 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.054552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkmw\" (UniqueName: \"kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw\") pod \"cinder-464a-account-create-r86r7\" (UID: \"2f3b76f5-2040-45f5-ae68-118f4399738b\") " pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.054658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdj2\" (UniqueName: \"kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2\") pod \"barbican-a3ad-account-create-tv29q\" (UID: \"0c6026a4-443b-44c1-8390-d1958c7b4f92\") " pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.095261 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.096746 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.101980 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdj2\" (UniqueName: \"kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2\") pod \"barbican-a3ad-account-create-tv29q\" (UID: \"0c6026a4-443b-44c1-8390-d1958c7b4f92\") " pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.131919 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.152655 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-l59ml"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.155646 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156666 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qswgc\" (UniqueName: \"kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156781 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156823 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156875 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.156912 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kkmw\" (UniqueName: \"kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw\") pod \"cinder-464a-account-create-r86r7\" (UID: \"2f3b76f5-2040-45f5-ae68-118f4399738b\") " pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.158334 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.158555 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.158685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nhw6m" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.159562 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l59ml"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.173557 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.192529 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kkmw\" (UniqueName: \"kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw\") pod \"cinder-464a-account-create-r86r7\" (UID: \"2f3b76f5-2040-45f5-ae68-118f4399738b\") " pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.261547 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-567b-account-create-957g7"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.265031 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.265158 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.265202 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qswgc\" (UniqueName: \"kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.265257 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.265295 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.266491 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.266591 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.267227 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.267753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.268765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.270823 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.272012 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.291467 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-567b-account-create-957g7"] Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.295938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qswgc\" (UniqueName: \"kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc\") pod \"dnsmasq-dns-769468b997-d9swz\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.343448 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.352612 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.386414 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.386657 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.386716 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.386804 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.386991 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9wt6\" (UniqueName: \"kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.387099 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjk5\" (UniqueName: \"kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5\") pod \"neutron-567b-account-create-957g7\" (UID: \"87675da0-8050-4d05-bc27-0c8e519a83c4\") " pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.442493 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-pf2kh" event={"ID":"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7","Type":"ContainerStarted","Data":"9c601db0ca9f168742c47a0a599aac8b2dff2aa4a71a70b9f1dfe67a7de29867"} Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.472820 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-pf2kh" podStartSLOduration=5.250214079 podStartE2EDuration="11.472801661s" podCreationTimestamp="2025-09-30 17:20:28 +0000 UTC" firstStartedPulling="2025-09-30 17:20:32.447097247 +0000 UTC m=+1133.354110078" lastFinishedPulling="2025-09-30 17:20:38.669684829 +0000 UTC m=+1139.576697660" observedRunningTime="2025-09-30 17:20:39.466665424 +0000 UTC m=+1140.373678255" watchObservedRunningTime="2025-09-30 17:20:39.472801661 +0000 UTC m=+1140.379814492" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.493782 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r9wt6\" (UniqueName: \"kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.493847 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjk5\" (UniqueName: \"kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5\") pod \"neutron-567b-account-create-957g7\" (UID: \"87675da0-8050-4d05-bc27-0c8e519a83c4\") " pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.493878 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.494026 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.494072 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.494120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.496605 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.499544 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.499976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.501408 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " 
pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.520770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjk5\" (UniqueName: \"kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5\") pod \"neutron-567b-account-create-957g7\" (UID: \"87675da0-8050-4d05-bc27-0c8e519a83c4\") " pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.524549 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9wt6\" (UniqueName: \"kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6\") pod \"placement-db-sync-l59ml\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.690640 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.691134 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.691386 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.708464 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.710334 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mgrkk"] Sep 30 17:20:39 crc kubenswrapper[4772]: W0930 17:20:39.767934 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41dd2191_6b9d_46d5_a9a4_53f6324a915c.slice/crio-55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc WatchSource:0}: Error finding container 55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc: Status 404 returned error can't find the container with id 55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc Sep 30 17:20:39 crc kubenswrapper[4772]: I0930 17:20:39.873123 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:39 crc kubenswrapper[4772]: W0930 17:20:39.914933 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded88a0aa_7513_4dfa_8a0a_3046d3ebe321.slice/crio-e2c48d240863bf0bfac148deccda449744b13b79499ec78968be6f1c3587a43a WatchSource:0}: Error finding container e2c48d240863bf0bfac148deccda449744b13b79499ec78968be6f1c3587a43a: Status 404 returned error can't find the container with id e2c48d240863bf0bfac148deccda449744b13b79499ec78968be6f1c3587a43a Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.076076 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.235945 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.244243 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a3ad-account-create-tv29q"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.252279 4772 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-464a-account-create-r86r7"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.272005 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.311246 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.472508 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-567b-account-create-957g7"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.472943 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-769468b997-d9swz" event={"ID":"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f","Type":"ContainerStarted","Data":"58a23f12b3d4c938d46275caee36fd116ec8fa07e06766f6fdbd8dc3fa0ff19c"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.484721 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l59ml"] Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.489758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" event={"ID":"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321","Type":"ContainerStarted","Data":"e2c48d240863bf0bfac148deccda449744b13b79499ec78968be6f1c3587a43a"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.495071 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-464a-account-create-r86r7" event={"ID":"2f3b76f5-2040-45f5-ae68-118f4399738b","Type":"ContainerStarted","Data":"b772d8c73ca8686ffda913126124b07d9d1f35cb1aea0343bcc5e38e2a26d708"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.498155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwm7r" event={"ID":"523a44fe-7e63-47a7-9b9d-4e272994dce1","Type":"ContainerStarted","Data":"901f6614fabc8f4551b399a76786ff27bf1ae43ea4b295635749f22328a69dd0"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.510579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mgrkk" event={"ID":"41dd2191-6b9d-46d5-a9a4-53f6324a915c","Type":"ContainerStarted","Data":"1a4133fb7ad3b9cf4ed58bf53c24a7060075d68fbba85ede3c954f922dc6e8e5"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.510625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mgrkk" event={"ID":"41dd2191-6b9d-46d5-a9a4-53f6324a915c","Type":"ContainerStarted","Data":"55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.515372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a3ad-account-create-tv29q" event={"ID":"0c6026a4-443b-44c1-8390-d1958c7b4f92","Type":"ContainerStarted","Data":"a59c3c71437353ad1f94a09821fc13b74b321465b39a6f012a12d064cb7fd376"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.518663 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerStarted","Data":"e4b60028358e320573e578b96a01f1b04a5d1f40967c8532f3070b35bc16e2ce"} Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.520119 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pwm7r" podStartSLOduration=3.130343962 podStartE2EDuration="34.520105634s" podCreationTimestamp="2025-09-30 17:20:06 +0000 UTC" firstStartedPulling="2025-09-30 
17:20:07.273560165 +0000 UTC m=+1108.180572996" lastFinishedPulling="2025-09-30 17:20:38.663321837 +0000 UTC m=+1139.570334668" observedRunningTime="2025-09-30 17:20:40.516452951 +0000 UTC m=+1141.423465782" watchObservedRunningTime="2025-09-30 17:20:40.520105634 +0000 UTC m=+1141.427118465" Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.523961 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.553382 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mgrkk" podStartSLOduration=2.553359074 podStartE2EDuration="2.553359074s" podCreationTimestamp="2025-09-30 17:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:40.544273042 +0000 UTC m=+1141.451285873" watchObservedRunningTime="2025-09-30 17:20:40.553359074 +0000 UTC m=+1141.460371905" Sep 30 17:20:40 crc kubenswrapper[4772]: W0930 17:20:40.593439 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode531fe5c_574b_4894_b491_a46e9892d380.slice/crio-164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28 WatchSource:0}: Error finding container 164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28: Status 404 returned error can't find the container with id 164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28 Sep 30 17:20:40 crc kubenswrapper[4772]: I0930 17:20:40.594768 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.308856 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.537501 4772 generic.go:334] "Generic (PLEG): container finished" podID="0c6026a4-443b-44c1-8390-d1958c7b4f92" containerID="e603d5d35581553f301c390d087917b97aff634e438c61d654975d7104445ff7" exitCode=0 Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.537581 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a3ad-account-create-tv29q" event={"ID":"0c6026a4-443b-44c1-8390-d1958c7b4f92","Type":"ContainerDied","Data":"e603d5d35581553f301c390d087917b97aff634e438c61d654975d7104445ff7"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.542659 4772 generic.go:334] "Generic (PLEG): container finished" podID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerID="6e6daa3966de40c8e5286931751cadbf06231f13e908509b97fba5a921203603" exitCode=0 Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.542735 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-769468b997-d9swz" event={"ID":"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f","Type":"ContainerDied","Data":"6e6daa3966de40c8e5286931751cadbf06231f13e908509b97fba5a921203603"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.545906 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l59ml" event={"ID":"e531fe5c-574b-4894-b491-a46e9892d380","Type":"ContainerStarted","Data":"164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.554028 4772 generic.go:334] "Generic (PLEG): container finished" podID="ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" 
containerID="ea64b9b0354250ab56653e810c8bbd18dd83877af56dec2ad9ac8d0a9f15f2fd" exitCode=0 Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.554291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" event={"ID":"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321","Type":"ContainerDied","Data":"ea64b9b0354250ab56653e810c8bbd18dd83877af56dec2ad9ac8d0a9f15f2fd"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.568559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-567b-account-create-957g7" event={"ID":"87675da0-8050-4d05-bc27-0c8e519a83c4","Type":"ContainerStarted","Data":"94eb884ebb25f57aedf87f75893a3b9cb1671eb96a2718f064598d23010344b9"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.568617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-567b-account-create-957g7" event={"ID":"87675da0-8050-4d05-bc27-0c8e519a83c4","Type":"ContainerStarted","Data":"66cc31b43354febed86150543cac866593bf11939c2b70c1518a6a0d81a10476"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.643200 4772 generic.go:334] "Generic (PLEG): container finished" podID="2f3b76f5-2040-45f5-ae68-118f4399738b" containerID="81995ac3da0930e34bed27a2b61146b6f1cf93a88c5de50e3f2249108d0cf3d5" exitCode=0 Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.646283 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-464a-account-create-r86r7" event={"ID":"2f3b76f5-2040-45f5-ae68-118f4399738b","Type":"ContainerDied","Data":"81995ac3da0930e34bed27a2b61146b6f1cf93a88c5de50e3f2249108d0cf3d5"} Sep 30 17:20:41 crc kubenswrapper[4772]: I0930 17:20:41.667717 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-567b-account-create-957g7" podStartSLOduration=2.6677010599999997 podStartE2EDuration="2.66770106s" podCreationTimestamp="2025-09-30 17:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:41.658868584 +0000 UTC m=+1142.565881415" watchObservedRunningTime="2025-09-30 17:20:41.66770106 +0000 UTC m=+1142.574713891" Sep 30 17:20:42 crc kubenswrapper[4772]: I0930 17:20:42.916800 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.026027 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb\") pod \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.026079 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vxvr\" (UniqueName: \"kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr\") pod \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.026857 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config\") pod \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.026982 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc\") pod \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.027015 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb\") pod \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\" (UID: \"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.040817 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr" (OuterVolumeSpecName: "kube-api-access-7vxvr") pod "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" (UID: "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321"). InnerVolumeSpecName "kube-api-access-7vxvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.068875 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" (UID: "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.072944 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" (UID: "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.073861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" (UID: "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.093796 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config" (OuterVolumeSpecName: "config") pod "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" (UID: "ed88a0aa-7513-4dfa-8a0a-3046d3ebe321"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.129716 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.129947 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.130037 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.130133 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.130221 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vxvr\" (UniqueName: \"kubernetes.io/projected/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321-kube-api-access-7vxvr\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.206405 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.220674 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.333850 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btdj2\" (UniqueName: \"kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2\") pod \"0c6026a4-443b-44c1-8390-d1958c7b4f92\" (UID: \"0c6026a4-443b-44c1-8390-d1958c7b4f92\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.333946 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kkmw\" (UniqueName: \"kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw\") pod \"2f3b76f5-2040-45f5-ae68-118f4399738b\" (UID: \"2f3b76f5-2040-45f5-ae68-118f4399738b\") " Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.341177 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2" (OuterVolumeSpecName: "kube-api-access-btdj2") pod "0c6026a4-443b-44c1-8390-d1958c7b4f92" (UID: "0c6026a4-443b-44c1-8390-d1958c7b4f92"). InnerVolumeSpecName "kube-api-access-btdj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.341255 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw" (OuterVolumeSpecName: "kube-api-access-6kkmw") pod "2f3b76f5-2040-45f5-ae68-118f4399738b" (UID: "2f3b76f5-2040-45f5-ae68-118f4399738b"). InnerVolumeSpecName "kube-api-access-6kkmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.436565 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btdj2\" (UniqueName: \"kubernetes.io/projected/0c6026a4-443b-44c1-8390-d1958c7b4f92-kube-api-access-btdj2\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.436600 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kkmw\" (UniqueName: \"kubernetes.io/projected/2f3b76f5-2040-45f5-ae68-118f4399738b-kube-api-access-6kkmw\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.679690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-464a-account-create-r86r7" event={"ID":"2f3b76f5-2040-45f5-ae68-118f4399738b","Type":"ContainerDied","Data":"b772d8c73ca8686ffda913126124b07d9d1f35cb1aea0343bcc5e38e2a26d708"} Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.679728 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b772d8c73ca8686ffda913126124b07d9d1f35cb1aea0343bcc5e38e2a26d708" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.679736 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-464a-account-create-r86r7" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.683775 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a3ad-account-create-tv29q" event={"ID":"0c6026a4-443b-44c1-8390-d1958c7b4f92","Type":"ContainerDied","Data":"a59c3c71437353ad1f94a09821fc13b74b321465b39a6f012a12d064cb7fd376"} Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.683820 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a59c3c71437353ad1f94a09821fc13b74b321465b39a6f012a12d064cb7fd376" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.683842 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a3ad-account-create-tv29q" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.687601 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-769468b997-d9swz" event={"ID":"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f","Type":"ContainerStarted","Data":"35947ac0a483794405a47f2bfd42a5c79f0152e322d704968d6e1a38c279aa24"} Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.687782 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.691906 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" event={"ID":"ed88a0aa-7513-4dfa-8a0a-3046d3ebe321","Type":"ContainerDied","Data":"e2c48d240863bf0bfac148deccda449744b13b79499ec78968be6f1c3587a43a"} Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.691961 4772 scope.go:117] "RemoveContainer" containerID="ea64b9b0354250ab56653e810c8bbd18dd83877af56dec2ad9ac8d0a9f15f2fd" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.692083 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f68f479bf-6x75v" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.693858 4772 generic.go:334] "Generic (PLEG): container finished" podID="87675da0-8050-4d05-bc27-0c8e519a83c4" containerID="94eb884ebb25f57aedf87f75893a3b9cb1671eb96a2718f064598d23010344b9" exitCode=0 Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.693929 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-567b-account-create-957g7" event={"ID":"87675da0-8050-4d05-bc27-0c8e519a83c4","Type":"ContainerDied","Data":"94eb884ebb25f57aedf87f75893a3b9cb1671eb96a2718f064598d23010344b9"} Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.730626 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-769468b997-d9swz" podStartSLOduration=4.7306001559999995 podStartE2EDuration="4.730600156s" podCreationTimestamp="2025-09-30 17:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:43.707003422 +0000 UTC m=+1144.614016253" watchObservedRunningTime="2025-09-30 17:20:43.730600156 +0000 UTC m=+1144.637612997" Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.846228 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.852814 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f68f479bf-6x75v"] Sep 30 17:20:43 crc kubenswrapper[4772]: I0930 17:20:43.907796 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" path="/var/lib/kubelet/pods/ed88a0aa-7513-4dfa-8a0a-3046d3ebe321/volumes" Sep 30 17:20:45 crc kubenswrapper[4772]: I0930 17:20:45.717690 4772 generic.go:334] "Generic (PLEG): container finished" podID="7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" containerID="9c601db0ca9f168742c47a0a599aac8b2dff2aa4a71a70b9f1dfe67a7de29867" exitCode=0 Sep 30 17:20:45 crc kubenswrapper[4772]: I0930 17:20:45.717783 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-pf2kh" event={"ID":"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7","Type":"ContainerDied","Data":"9c601db0ca9f168742c47a0a599aac8b2dff2aa4a71a70b9f1dfe67a7de29867"} Sep 30 17:20:46 
crc kubenswrapper[4772]: I0930 17:20:46.728299 4772 generic.go:334] "Generic (PLEG): container finished" podID="41dd2191-6b9d-46d5-a9a4-53f6324a915c" containerID="1a4133fb7ad3b9cf4ed58bf53c24a7060075d68fbba85ede3c954f922dc6e8e5" exitCode=0 Sep 30 17:20:46 crc kubenswrapper[4772]: I0930 17:20:46.728475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mgrkk" event={"ID":"41dd2191-6b9d-46d5-a9a4-53f6324a915c","Type":"ContainerDied","Data":"1a4133fb7ad3b9cf4ed58bf53c24a7060075d68fbba85ede3c954f922dc6e8e5"} Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.461603 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.487246 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.629522 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data\") pod \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.629764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdrc2\" (UniqueName: \"kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2\") pod \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.629860 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle\") pod \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.629951 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjk5\" (UniqueName: \"kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5\") pod \"87675da0-8050-4d05-bc27-0c8e519a83c4\" (UID: \"87675da0-8050-4d05-bc27-0c8e519a83c4\") " Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.630121 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data\") pod \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\" (UID: \"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7\") " Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.634331 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" (UID: "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.634506 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2" (OuterVolumeSpecName: "kube-api-access-jdrc2") pod "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" (UID: "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7"). 
InnerVolumeSpecName "kube-api-access-jdrc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.636316 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5" (OuterVolumeSpecName: "kube-api-access-8sjk5") pod "87675da0-8050-4d05-bc27-0c8e519a83c4" (UID: "87675da0-8050-4d05-bc27-0c8e519a83c4"). InnerVolumeSpecName "kube-api-access-8sjk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.669844 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" (UID: "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.691627 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data" (OuterVolumeSpecName: "config-data") pod "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" (UID: "7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.732708 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.732741 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjk5\" (UniqueName: \"kubernetes.io/projected/87675da0-8050-4d05-bc27-0c8e519a83c4-kube-api-access-8sjk5\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.732752 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.732760 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.732768 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdrc2\" (UniqueName: \"kubernetes.io/projected/7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7-kube-api-access-jdrc2\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.740803 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-567b-account-create-957g7" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.742319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-567b-account-create-957g7" event={"ID":"87675da0-8050-4d05-bc27-0c8e519a83c4","Type":"ContainerDied","Data":"66cc31b43354febed86150543cac866593bf11939c2b70c1518a6a0d81a10476"} Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.742391 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66cc31b43354febed86150543cac866593bf11939c2b70c1518a6a0d81a10476" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.753961 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-pf2kh" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.753979 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-pf2kh" event={"ID":"7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7","Type":"ContainerDied","Data":"f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43"} Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.754119 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0109b21e2c24c5f94396b235394c0879720e01be5d4485d3d53135bfed18c43" Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.776185 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerStarted","Data":"55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7"} Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.783354 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l59ml" event={"ID":"e531fe5c-574b-4894-b491-a46e9892d380","Type":"ContainerStarted","Data":"497209c744bd058efc3c7294be6072f96bcb866cda13531d017880cd7e07b1e9"} Sep 30 17:20:47 crc kubenswrapper[4772]: I0930 17:20:47.808475 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-l59ml" podStartSLOduration=2.257724704 podStartE2EDuration="8.808449771s" podCreationTimestamp="2025-09-30 17:20:39 +0000 UTC" firstStartedPulling="2025-09-30 17:20:40.643460977 +0000 UTC m=+1141.550473808" lastFinishedPulling="2025-09-30 17:20:47.194186054 +0000 UTC m=+1148.101198875" observedRunningTime="2025-09-30 17:20:47.797555482 +0000 UTC m=+1148.704568313" watchObservedRunningTime="2025-09-30 17:20:47.808449771 +0000 UTC m=+1148.715462602" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.045678 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: E0930 17:20:48.046412 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6026a4-443b-44c1-8390-d1958c7b4f92" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046429 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6026a4-443b-44c1-8390-d1958c7b4f92" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: E0930 17:20:48.046449 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" containerName="init" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046455 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" containerName="init" Sep 30 17:20:48 crc kubenswrapper[4772]: E0930 17:20:48.046474 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" containerName="watcher-db-sync" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046480 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" containerName="watcher-db-sync" Sep 30 17:20:48 crc kubenswrapper[4772]: E0930 17:20:48.046488 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3b76f5-2040-45f5-ae68-118f4399738b" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046494 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3b76f5-2040-45f5-ae68-118f4399738b" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: E0930 17:20:48.046503 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87675da0-8050-4d05-bc27-0c8e519a83c4" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046510 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="87675da0-8050-4d05-bc27-0c8e519a83c4" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046696 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6026a4-443b-44c1-8390-d1958c7b4f92" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046710 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="87675da0-8050-4d05-bc27-0c8e519a83c4" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046723 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3b76f5-2040-45f5-ae68-118f4399738b" containerName="mariadb-account-create" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046734 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e98d64c-5c8f-4fcb-a5a9-315cd49ab5d7" containerName="watcher-db-sync" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.046745 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed88a0aa-7513-4dfa-8a0a-3046d3ebe321" containerName="init" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.047700 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.051573 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5kwnw" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.051565 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.058799 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.115011 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.116902 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.121007 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.141961 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.142009 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.142044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.142102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.142165 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4xk\" (UniqueName: \"kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.153014 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.161309 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.162998 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.168427 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.179307 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243784 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf22b9ce-256e-4ba4-95ba-53778c010876-logs\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243871 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243891 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243926 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243954 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-config-data\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.243971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.244030 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4xk\" (UniqueName: \"kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.244085 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.244104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj96v\" (UniqueName: 
\"kubernetes.io/projected/bf22b9ce-256e-4ba4-95ba-53778c010876-kube-api-access-qj96v\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.247452 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.252153 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.253332 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.253997 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.261770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4xk\" (UniqueName: \"kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk\") pod \"watcher-api-0\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-config-data\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345272 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj96v\" (UniqueName: \"kubernetes.io/projected/bf22b9ce-256e-4ba4-95ba-53778c010876-kube-api-access-qj96v\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345465 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf22b9ce-256e-4ba4-95ba-53778c010876-logs\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345575 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69f02322-0ff1-410e-8b46-dd3b5f909963-logs\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-config-data\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345667 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345694 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxmd\" (UniqueName: \"kubernetes.io/projected/69f02322-0ff1-410e-8b46-dd3b5f909963-kube-api-access-lkxmd\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.345733 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.346342 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf22b9ce-256e-4ba4-95ba-53778c010876-logs\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.362693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-config-data\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.363601 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj96v\" (UniqueName: \"kubernetes.io/projected/bf22b9ce-256e-4ba4-95ba-53778c010876-kube-api-access-qj96v\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.380107 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.385821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf22b9ce-256e-4ba4-95ba-53778c010876-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bf22b9ce-256e-4ba4-95ba-53778c010876\") " pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.447312 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.447387 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-config-data\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.447481 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69f02322-0ff1-410e-8b46-dd3b5f909963-logs\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.447514 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.447532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxmd\" (UniqueName: \"kubernetes.io/projected/69f02322-0ff1-410e-8b46-dd3b5f909963-kube-api-access-lkxmd\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.448708 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69f02322-0ff1-410e-8b46-dd3b5f909963-logs\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.454038 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.456685 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.458927 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-config-data\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.459505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f02322-0ff1-410e-8b46-dd3b5f909963-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.472353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxmd\" (UniqueName: \"kubernetes.io/projected/69f02322-0ff1-410e-8b46-dd3b5f909963-kube-api-access-lkxmd\") pod \"watcher-decision-engine-0\" (UID: \"69f02322-0ff1-410e-8b46-dd3b5f909963\") " pod="openstack/watcher-decision-engine-0" Sep 30 17:20:48 crc kubenswrapper[4772]: I0930 17:20:48.532618 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.233364 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f85ns"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.242903 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.245899 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.246295 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5qb4b" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.246979 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.259232 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.259921 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f85ns"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.350537 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.371983 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzr7x\" (UniqueName: \"kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372051 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372131 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372172 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372282 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys\") pod \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\" (UID: \"41dd2191-6b9d-46d5-a9a4-53f6324a915c\") " Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372525 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372631 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372763 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.372786 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp98\" (UniqueName: \"kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.396731 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x" (OuterVolumeSpecName: "kube-api-access-hzr7x") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "kube-api-access-hzr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.413125 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts" (OuterVolumeSpecName: "scripts") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.432180 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.438220 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.438304 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wfrg6"] Sep 30 17:20:49 crc kubenswrapper[4772]: E0930 17:20:49.439094 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dd2191-6b9d-46d5-a9a4-53f6324a915c" containerName="keystone-bootstrap" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.439110 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dd2191-6b9d-46d5-a9a4-53f6324a915c" containerName="keystone-bootstrap" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.439529 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dd2191-6b9d-46d5-a9a4-53f6324a915c" containerName="keystone-bootstrap" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.455482 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.462254 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.463155 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w8gft" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477069 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477130 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477221 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp98\" (UniqueName: \"kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477241 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477385 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " 
pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477473 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzr7x\" (UniqueName: \"kubernetes.io/projected/41dd2191-6b9d-46d5-a9a4-53f6324a915c-kube-api-access-hzr7x\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477486 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477496 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.477507 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.485190 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.502075 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.504782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.506071 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.509228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.509641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.510210 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data" (OuterVolumeSpecName: "config-data") pod "41dd2191-6b9d-46d5-a9a4-53f6324a915c" (UID: "41dd2191-6b9d-46d5-a9a4-53f6324a915c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.526386 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.526817 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="dnsmasq-dns" containerID="cri-o://07645a3578e6eb48a011080c809807704a17c3736accb5af852dc82610a513aa" gracePeriod=10 Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.527785 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp98\" (UniqueName: \"kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98\") pod \"cinder-db-sync-f85ns\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.569739 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wfrg6"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.582015 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z8h8\" (UniqueName: \"kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.582150 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.582359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.582475 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.582550 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41dd2191-6b9d-46d5-a9a4-53f6324a915c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.597124 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f85ns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.628165 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-lvdj8"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.630007 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.642863 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lvdj8"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.648964 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6zwns" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.649185 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.660627 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.684790 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.684908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z8h8\" (UniqueName: \"kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.684949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.699244 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.701589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.705693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z8h8\" (UniqueName: \"kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8\") pod \"barbican-db-sync-wfrg6\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " 
pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.786342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.792592 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zss5w\" (UniqueName: \"kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.792739 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.834191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerStarted","Data":"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38"} Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.840695 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mgrkk" event={"ID":"41dd2191-6b9d-46d5-a9a4-53f6324a915c","Type":"ContainerDied","Data":"55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc"} Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.840746 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55e2843d638ba480bb61046eadc351e1de27fc75bd6190eb4e3c79d0e0a759dc" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.840860 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mgrkk" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.849120 4772 generic.go:334] "Generic (PLEG): container finished" podID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerID="07645a3578e6eb48a011080c809807704a17c3736accb5af852dc82610a513aa" exitCode=0 Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.849173 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" event={"ID":"eb1c02f4-3278-47ea-8958-945b14fe2868","Type":"ContainerDied","Data":"07645a3578e6eb48a011080c809807704a17c3736accb5af852dc82610a513aa"} Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.894521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zss5w\" (UniqueName: \"kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.894593 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.894681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.899836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.900420 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.915343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zss5w\" (UniqueName: \"kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w\") pod \"neutron-db-sync-lvdj8\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.955353 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.984293 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:20:49 crc kubenswrapper[4772]: I0930 17:20:49.994072 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.124947 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 17:20:50 crc kubenswrapper[4772]: W0930 17:20:50.177636 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf22b9ce_256e_4ba4_95ba_53778c010876.slice/crio-5a21836a172459ba6036118b2266407833eb9ece17b3db11b1c24490d4c52195 WatchSource:0}: Error finding container 5a21836a172459ba6036118b2266407833eb9ece17b3db11b1c24490d4c52195: Status 404 returned error can't find the container with id 5a21836a172459ba6036118b2266407833eb9ece17b3db11b1c24490d4c52195 Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.273866 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 17:20:50 crc kubenswrapper[4772]: W0930 17:20:50.294436 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f02322_0ff1_410e_8b46_dd3b5f909963.slice/crio-ca6563aeb39d4d97913796e22e8a548f3ee487451e4879dc2f0f3628cdbf293a WatchSource:0}: Error finding container ca6563aeb39d4d97913796e22e8a548f3ee487451e4879dc2f0f3628cdbf293a: Status 404 returned error can't find the container with id ca6563aeb39d4d97913796e22e8a548f3ee487451e4879dc2f0f3628cdbf293a Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.383467 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f85ns"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.387673 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.404489 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mgrkk"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.412114 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mgrkk"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.506744 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xwzjp"] Sep 30 17:20:50 crc kubenswrapper[4772]: E0930 17:20:50.507214 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="dnsmasq-dns" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.507234 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="dnsmasq-dns" Sep 30 17:20:50 crc kubenswrapper[4772]: E0930 17:20:50.507275 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="init" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.507283 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="init" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.508013 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" containerName="dnsmasq-dns" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.508857 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.510789 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511131 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb\") pod \"eb1c02f4-3278-47ea-8958-945b14fe2868\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511176 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb\") pod \"eb1c02f4-3278-47ea-8958-945b14fe2868\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511270 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7td\" (UniqueName: \"kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td\") pod \"eb1c02f4-3278-47ea-8958-945b14fe2868\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config\") pod \"eb1c02f4-3278-47ea-8958-945b14fe2868\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511375 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc\") pod \"eb1c02f4-3278-47ea-8958-945b14fe2868\" (UID: \"eb1c02f4-3278-47ea-8958-945b14fe2868\") " Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.511897 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.512130 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.513203 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fsmbb" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.520582 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td" (OuterVolumeSpecName: "kube-api-access-lq7td") pod "eb1c02f4-3278-47ea-8958-945b14fe2868" (UID: "eb1c02f4-3278-47ea-8958-945b14fe2868"). InnerVolumeSpecName "kube-api-access-lq7td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.521482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xwzjp"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.531123 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7td\" (UniqueName: \"kubernetes.io/projected/eb1c02f4-3278-47ea-8958-945b14fe2868-kube-api-access-lq7td\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.603296 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config" (OuterVolumeSpecName: "config") pod "eb1c02f4-3278-47ea-8958-945b14fe2868" (UID: "eb1c02f4-3278-47ea-8958-945b14fe2868"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.611482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wfrg6"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.618450 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lvdj8"] Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.629891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb1c02f4-3278-47ea-8958-945b14fe2868" (UID: "eb1c02f4-3278-47ea-8958-945b14fe2868"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632496 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632612 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632693 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data\") pod \"keystone-bootstrap-xwzjp\" (UID: 
\"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.632737 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdjh8\" (UniqueName: \"kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.633189 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.633394 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.637295 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb1c02f4-3278-47ea-8958-945b14fe2868" (UID: "eb1c02f4-3278-47ea-8958-945b14fe2868"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.668911 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb1c02f4-3278-47ea-8958-945b14fe2868" (UID: "eb1c02f4-3278-47ea-8958-945b14fe2868"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736650 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdjh8\" (UniqueName: \"kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736755 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736828 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736861 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.736964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.737047 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.737089 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb1c02f4-3278-47ea-8958-945b14fe2868-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.740910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.745098 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") 
" pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.749435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.749465 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.749446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.775485 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdjh8\" (UniqueName: \"kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8\") pod \"keystone-bootstrap-xwzjp\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.841193 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.870326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"bf22b9ce-256e-4ba4-95ba-53778c010876","Type":"ContainerStarted","Data":"5a21836a172459ba6036118b2266407833eb9ece17b3db11b1c24490d4c52195"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.882400 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.882517 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556574fbcf-gsxp7" event={"ID":"eb1c02f4-3278-47ea-8958-945b14fe2868","Type":"ContainerDied","Data":"e3f42dcdeb8a994de7bb4909ce1b3dac2f5369fcb37328d6fc6e20776beef443"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.882574 4772 scope.go:117] "RemoveContainer" containerID="07645a3578e6eb48a011080c809807704a17c3736accb5af852dc82610a513aa" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.887118 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wfrg6" event={"ID":"93e25cc4-9ac5-4e36-87b0-4523bba98b4b","Type":"ContainerStarted","Data":"f8bfea4fefe47ca6ea9bc782144f60ccca4eea41c81df9f281683e0c170d3c23"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.905507 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lvdj8" event={"ID":"23c54c4a-b5a3-4234-8cdc-62d55390d7c9","Type":"ContainerStarted","Data":"b809ff8bdee644a480d13823c3b8e41db257743de90fe14e1dd6e59682eea32c"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.910967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerStarted","Data":"5f28763e6f4703448f60d7687ee136027f40322f9060d1476eb00386374f453a"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.911026 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerStarted","Data":"b2571892e8f82677b30c5221ccf5b4c310a48b4100cea3a5a57304767e1b9cec"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.911041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerStarted","Data":"0a8da0cf1e142821d8253df28639eb064502a73493c9b1db25f2bdc66c03382b"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.911598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.914799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f85ns" event={"ID":"7633806b-c365-4597-b298-1e9767c640d4","Type":"ContainerStarted","Data":"5838cdfd76fc76331754a992f938e0cb8306218dc6c073658c7df445460ee976"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.927781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"ca6563aeb39d4d97913796e22e8a548f3ee487451e4879dc2f0f3628cdbf293a"} Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.938158 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.9381386369999998 podStartE2EDuration="2.938138637s" podCreationTimestamp="2025-09-30 17:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:50.934200886 +0000 UTC m=+1151.841213717" watchObservedRunningTime="2025-09-30 17:20:50.938138637 +0000 UTC m=+1151.845151468" Sep 30 17:20:50 crc kubenswrapper[4772]: I0930 17:20:50.993119 4772 scope.go:117] "RemoveContainer" 
containerID="ff8328a751db53ac5bb7008f5528622bb6fd64c859175bdaf3153b2197395f1a" Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.001934 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.023153 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-556574fbcf-gsxp7"] Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.453724 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xwzjp"] Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.914516 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41dd2191-6b9d-46d5-a9a4-53f6324a915c" path="/var/lib/kubelet/pods/41dd2191-6b9d-46d5-a9a4-53f6324a915c/volumes" Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.915733 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1c02f4-3278-47ea-8958-945b14fe2868" path="/var/lib/kubelet/pods/eb1c02f4-3278-47ea-8958-945b14fe2868/volumes" Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.947313 4772 generic.go:334] "Generic (PLEG): container finished" podID="e531fe5c-574b-4894-b491-a46e9892d380" containerID="497209c744bd058efc3c7294be6072f96bcb866cda13531d017880cd7e07b1e9" exitCode=0 Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.947594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l59ml" event={"ID":"e531fe5c-574b-4894-b491-a46e9892d380","Type":"ContainerDied","Data":"497209c744bd058efc3c7294be6072f96bcb866cda13531d017880cd7e07b1e9"} Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.949910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lvdj8" event={"ID":"23c54c4a-b5a3-4234-8cdc-62d55390d7c9","Type":"ContainerStarted","Data":"4e48e248cc6d67699c0134651aa74d5261771500766df0325c489e76ce9306ef"} Sep 30 17:20:51 crc kubenswrapper[4772]: I0930 17:20:51.984983 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-lvdj8" podStartSLOduration=2.984964367 podStartE2EDuration="2.984964367s" podCreationTimestamp="2025-09-30 17:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:51.984948697 +0000 UTC m=+1152.891961528" watchObservedRunningTime="2025-09-30 17:20:51.984964367 +0000 UTC m=+1152.891977188" Sep 30 17:20:52 crc kubenswrapper[4772]: I0930 17:20:52.960922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwzjp" event={"ID":"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2","Type":"ContainerStarted","Data":"deb6e52c5e903919632c9bc0d45184c8fb5e8d4f19d438f801dc0bbc7d6af93b"} Sep 30 17:20:53 crc kubenswrapper[4772]: I0930 17:20:53.381768 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 17:20:53 crc kubenswrapper[4772]: I0930 17:20:53.381871 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:20:53 crc kubenswrapper[4772]: I0930 17:20:53.687014 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 17:20:54 crc kubenswrapper[4772]: I0930 17:20:54.989931 4772 generic.go:334] "Generic (PLEG): container finished" podID="523a44fe-7e63-47a7-9b9d-4e272994dce1" containerID="901f6614fabc8f4551b399a76786ff27bf1ae43ea4b295635749f22328a69dd0" exitCode=0 Sep 30 17:20:54 
crc kubenswrapper[4772]: I0930 17:20:54.989991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwm7r" event={"ID":"523a44fe-7e63-47a7-9b9d-4e272994dce1","Type":"ContainerDied","Data":"901f6614fabc8f4551b399a76786ff27bf1ae43ea4b295635749f22328a69dd0"} Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.434226 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.520166 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data\") pod \"e531fe5c-574b-4894-b491-a46e9892d380\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.520296 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle\") pod \"e531fe5c-574b-4894-b491-a46e9892d380\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.520358 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs\") pod \"e531fe5c-574b-4894-b491-a46e9892d380\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.520400 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9wt6\" (UniqueName: \"kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6\") pod \"e531fe5c-574b-4894-b491-a46e9892d380\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.520435 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts\") pod \"e531fe5c-574b-4894-b491-a46e9892d380\" (UID: \"e531fe5c-574b-4894-b491-a46e9892d380\") " Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.521253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs" (OuterVolumeSpecName: "logs") pod "e531fe5c-574b-4894-b491-a46e9892d380" (UID: "e531fe5c-574b-4894-b491-a46e9892d380"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.524863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts" (OuterVolumeSpecName: "scripts") pod "e531fe5c-574b-4894-b491-a46e9892d380" (UID: "e531fe5c-574b-4894-b491-a46e9892d380"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.525487 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6" (OuterVolumeSpecName: "kube-api-access-r9wt6") pod "e531fe5c-574b-4894-b491-a46e9892d380" (UID: "e531fe5c-574b-4894-b491-a46e9892d380"). InnerVolumeSpecName "kube-api-access-r9wt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.546941 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data" (OuterVolumeSpecName: "config-data") pod "e531fe5c-574b-4894-b491-a46e9892d380" (UID: "e531fe5c-574b-4894-b491-a46e9892d380"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.552539 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e531fe5c-574b-4894-b491-a46e9892d380" (UID: "e531fe5c-574b-4894-b491-a46e9892d380"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.623656 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.623697 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.623713 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e531fe5c-574b-4894-b491-a46e9892d380-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.623725 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9wt6\" (UniqueName: \"kubernetes.io/projected/e531fe5c-574b-4894-b491-a46e9892d380-kube-api-access-r9wt6\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:57 crc kubenswrapper[4772]: I0930 17:20:57.623737 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e531fe5c-574b-4894-b491-a46e9892d380-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.020276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l59ml" event={"ID":"e531fe5c-574b-4894-b491-a46e9892d380","Type":"ContainerDied","Data":"164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28"} Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.020337 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="164ea7ff480b3ee9f92f720ba92d7b5364133919e5b4a66cae376e741a5ffb28" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.020394 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-l59ml" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.381609 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.391607 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.564227 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-79fbb4fcd8-68j8v"] Sep 30 17:20:58 crc kubenswrapper[4772]: E0930 17:20:58.564897 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e531fe5c-574b-4894-b491-a46e9892d380" containerName="placement-db-sync" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.564917 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e531fe5c-574b-4894-b491-a46e9892d380" containerName="placement-db-sync" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.565109 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e531fe5c-574b-4894-b491-a46e9892d380" containerName="placement-db-sync" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.566001 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.572752 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.572751 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.572911 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nhw6m" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.572911 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.573136 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.576973 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79fbb4fcd8-68j8v"] Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642532 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-config-data\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642684 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7z9\" (UniqueName: \"kubernetes.io/projected/ef6c9261-05fa-449e-87ba-2c33d858daec-kube-api-access-gs7z9\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-public-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " 
pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6c9261-05fa-449e-87ba-2c33d858daec-logs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642884 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-scripts\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-combined-ca-bundle\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.642956 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-internal-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744589 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-public-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6c9261-05fa-449e-87ba-2c33d858daec-logs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-scripts\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-combined-ca-bundle\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-internal-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc 
kubenswrapper[4772]: I0930 17:20:58.744757 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-config-data\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.744808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7z9\" (UniqueName: \"kubernetes.io/projected/ef6c9261-05fa-449e-87ba-2c33d858daec-kube-api-access-gs7z9\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.746389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6c9261-05fa-449e-87ba-2c33d858daec-logs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.754242 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-scripts\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.754310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-public-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.759088 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-config-data\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.761760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-combined-ca-bundle\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.764164 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7z9\" (UniqueName: \"kubernetes.io/projected/ef6c9261-05fa-449e-87ba-2c33d858daec-kube-api-access-gs7z9\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.766441 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6c9261-05fa-449e-87ba-2c33d858daec-internal-tls-certs\") pod \"placement-79fbb4fcd8-68j8v\" (UID: \"ef6c9261-05fa-449e-87ba-2c33d858daec\") " pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:58 crc kubenswrapper[4772]: I0930 17:20:58.927203 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:20:59 crc kubenswrapper[4772]: I0930 17:20:59.040436 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 17:21:01 crc kubenswrapper[4772]: I0930 17:21:01.413447 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:01 crc kubenswrapper[4772]: I0930 17:21:01.416281 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" containerID="cri-o://b2571892e8f82677b30c5221ccf5b4c310a48b4100cea3a5a57304767e1b9cec" gracePeriod=30 Sep 30 17:21:01 crc kubenswrapper[4772]: I0930 17:21:01.416361 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" containerID="cri-o://5f28763e6f4703448f60d7687ee136027f40322f9060d1476eb00386374f453a" gracePeriod=30 Sep 30 17:21:02 crc kubenswrapper[4772]: I0930 17:21:02.097083 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerID="b2571892e8f82677b30c5221ccf5b4c310a48b4100cea3a5a57304767e1b9cec" exitCode=143 Sep 30 17:21:02 crc kubenswrapper[4772]: I0930 17:21:02.097135 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerDied","Data":"b2571892e8f82677b30c5221ccf5b4c310a48b4100cea3a5a57304767e1b9cec"} Sep 30 17:21:03 crc kubenswrapper[4772]: I0930 17:21:03.111772 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerID="5f28763e6f4703448f60d7687ee136027f40322f9060d1476eb00386374f453a" exitCode=0 Sep 30 17:21:03 crc kubenswrapper[4772]: I0930 17:21:03.111924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerDied","Data":"5f28763e6f4703448f60d7687ee136027f40322f9060d1476eb00386374f453a"} Sep 30 17:21:03 crc kubenswrapper[4772]: I0930 17:21:03.381495 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Sep 30 17:21:03 crc kubenswrapper[4772]: I0930 17:21:03.381521 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Sep 30 17:21:08 crc kubenswrapper[4772]: I0930 17:21:08.382144 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Sep 30 17:21:08 crc kubenswrapper[4772]: I0930 17:21:08.382145 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: 
connect: connection refused" Sep 30 17:21:08 crc kubenswrapper[4772]: I0930 17:21:08.656038 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:21:08 crc kubenswrapper[4772]: I0930 17:21:08.656355 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:21:11 crc kubenswrapper[4772]: E0930 17:21:11.319622 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Sep 30 17:21:11 crc kubenswrapper[4772]: E0930 17:21:11.320350 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4r2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1be5891a-e27f-4f51-868f-90a7ade7d4bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.373608 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pwm7r" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.429636 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data\") pod \"523a44fe-7e63-47a7-9b9d-4e272994dce1\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.429765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzz9r\" (UniqueName: \"kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r\") pod \"523a44fe-7e63-47a7-9b9d-4e272994dce1\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.429849 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle\") pod \"523a44fe-7e63-47a7-9b9d-4e272994dce1\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.429926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data\") pod \"523a44fe-7e63-47a7-9b9d-4e272994dce1\" (UID: \"523a44fe-7e63-47a7-9b9d-4e272994dce1\") " Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.439302 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "523a44fe-7e63-47a7-9b9d-4e272994dce1" (UID: "523a44fe-7e63-47a7-9b9d-4e272994dce1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.439369 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r" (OuterVolumeSpecName: "kube-api-access-qzz9r") pod "523a44fe-7e63-47a7-9b9d-4e272994dce1" (UID: "523a44fe-7e63-47a7-9b9d-4e272994dce1"). InnerVolumeSpecName "kube-api-access-qzz9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.515235 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "523a44fe-7e63-47a7-9b9d-4e272994dce1" (UID: "523a44fe-7e63-47a7-9b9d-4e272994dce1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.527729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data" (OuterVolumeSpecName: "config-data") pod "523a44fe-7e63-47a7-9b9d-4e272994dce1" (UID: "523a44fe-7e63-47a7-9b9d-4e272994dce1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.532671 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.532707 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.532719 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzz9r\" (UniqueName: \"kubernetes.io/projected/523a44fe-7e63-47a7-9b9d-4e272994dce1-kube-api-access-qzz9r\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:11 crc kubenswrapper[4772]: I0930 17:21:11.532728 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523a44fe-7e63-47a7-9b9d-4e272994dce1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.211870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwm7r" event={"ID":"523a44fe-7e63-47a7-9b9d-4e272994dce1","Type":"ContainerDied","Data":"dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05"} Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.211924 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dca38f6bf43cfe0eb347b6387b76b9ce3833c24f822e264a638aec39082f8d05" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.212003 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pwm7r" Sep 30 17:21:12 crc kubenswrapper[4772]: E0930 17:21:12.515659 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 17:21:12 crc kubenswrapper[4772]: E0930 17:21:12.516260 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 17:21:12 crc kubenswrapper[4772]: E0930 17:21:12.516471 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.129.56.221:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mp98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f85ns_openstack(7633806b-c365-4597-b298-1e9767c640d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:21:12 crc kubenswrapper[4772]: E0930 17:21:12.517796 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack/cinder-db-sync-f85ns" podUID="7633806b-c365-4597-b298-1e9767c640d4" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.862397 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:12 crc kubenswrapper[4772]: E0930 17:21:12.862960 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" containerName="glance-db-sync" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.862982 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" containerName="glance-db-sync" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.863353 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" containerName="glance-db-sync" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.866493 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.884020 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.964525 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzp4\" (UniqueName: \"kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.964600 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.964629 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.964655 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:12 crc kubenswrapper[4772]: I0930 17:21:12.964738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.068183 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzp4\" (UniqueName: \"kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: 
\"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.068773 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.069974 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.068808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.069984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.070096 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.070205 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.070684 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.070908 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.087921 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzp4\" (UniqueName: \"kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4\") pod \"dnsmasq-dns-ccbf6dc95-58jkq\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.205179 4772 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:13 crc kubenswrapper[4772]: E0930 17:21:13.225932 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.221:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-f85ns" podUID="7633806b-c365-4597-b298-1e9767c640d4" Sep 30 17:21:13 crc kubenswrapper[4772]: E0930 17:21:13.461439 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Sep 30 17:21:13 crc kubenswrapper[4772]: E0930 17:21:13.461487 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Sep 30 17:21:13 crc kubenswrapper[4772]: E0930 17:21:13.461595 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.129.56.221:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4z8h8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wfrg6_openstack(93e25cc4-9ac5-4e36-87b0-4523bba98b4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:21:13 crc kubenswrapper[4772]: E0930 17:21:13.462821 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wfrg6" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.819381 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.887508 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle\") pod \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.887717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs\") pod \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.887850 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4xk\" (UniqueName: \"kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk\") pod \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.887909 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data\") pod \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.887932 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca\") pod \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\" (UID: \"9f56ca6b-b0f8-4f59-8e69-a28d900046fe\") " Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.890409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs" (OuterVolumeSpecName: "logs") pod "9f56ca6b-b0f8-4f59-8e69-a28d900046fe" (UID: "9f56ca6b-b0f8-4f59-8e69-a28d900046fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.902428 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk" (OuterVolumeSpecName: "kube-api-access-hd4xk") pod "9f56ca6b-b0f8-4f59-8e69-a28d900046fe" (UID: "9f56ca6b-b0f8-4f59-8e69-a28d900046fe"). InnerVolumeSpecName "kube-api-access-hd4xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.992006 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:13 crc kubenswrapper[4772]: I0930 17:21:13.992378 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4xk\" (UniqueName: \"kubernetes.io/projected/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-kube-api-access-hd4xk\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.041859 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f56ca6b-b0f8-4f59-8e69-a28d900046fe" (UID: "9f56ca6b-b0f8-4f59-8e69-a28d900046fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.074102 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9f56ca6b-b0f8-4f59-8e69-a28d900046fe" (UID: "9f56ca6b-b0f8-4f59-8e69-a28d900046fe"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.082587 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.086097 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data" (OuterVolumeSpecName: "config-data") pod "9f56ca6b-b0f8-4f59-8e69-a28d900046fe" (UID: "9f56ca6b-b0f8-4f59-8e69-a28d900046fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.095213 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.095253 4772 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.095284 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f56ca6b-b0f8-4f59-8e69-a28d900046fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:14 crc kubenswrapper[4772]: W0930 17:21:14.095911 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13745ce_d903_4a06_a020_2095be2c3e55.slice/crio-172ce5b9832b8557654eb5391204d7d6bcbe10c163804f66ceffb3366cc6a900 WatchSource:0}: Error finding container 172ce5b9832b8557654eb5391204d7d6bcbe10c163804f66ceffb3366cc6a900: Status 404 returned error can't find the container with id 172ce5b9832b8557654eb5391204d7d6bcbe10c163804f66ceffb3366cc6a900 Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.206279 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79fbb4fcd8-68j8v"] Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.258867 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9f56ca6b-b0f8-4f59-8e69-a28d900046fe","Type":"ContainerDied","Data":"0a8da0cf1e142821d8253df28639eb064502a73493c9b1db25f2bdc66c03382b"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.258921 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.258961 4772 scope.go:117] "RemoveContainer" containerID="5f28763e6f4703448f60d7687ee136027f40322f9060d1476eb00386374f453a" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.269473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwzjp" event={"ID":"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2","Type":"ContainerStarted","Data":"bf446099bc0634fd206503e4ecbd53eac5f303a6d83ddda96c06c4b8ae0b2974"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.276327 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79fbb4fcd8-68j8v" event={"ID":"ef6c9261-05fa-449e-87ba-2c33d858daec","Type":"ContainerStarted","Data":"b18b835b213f01eba790dd591190b4f62f7c02c5a02179ba825fe02de783ca7a"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.284077 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"8d19337279315d1891c15d4b23002d6091d82b7dc6b7501c5bc2e6f7e8dd0d32"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.293463 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"bf22b9ce-256e-4ba4-95ba-53778c010876","Type":"ContainerStarted","Data":"74b228ea9f2fe9851bde94e427940d4968f71d694275ef1341bf90643ec316c2"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.301288 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" event={"ID":"a13745ce-d903-4a06-a020-2095be2c3e55","Type":"ContainerStarted","Data":"172ce5b9832b8557654eb5391204d7d6bcbe10c163804f66ceffb3366cc6a900"} Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.309796 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xwzjp" podStartSLOduration=24.308505799 podStartE2EDuration="24.308505799s" podCreationTimestamp="2025-09-30 17:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:14.291047527 +0000 UTC m=+1175.198060358" watchObservedRunningTime="2025-09-30 17:21:14.308505799 +0000 UTC m=+1175.215518650" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.317385 4772 scope.go:117] "RemoveContainer" containerID="b2571892e8f82677b30c5221ccf5b4c310a48b4100cea3a5a57304767e1b9cec" Sep 30 17:21:14 crc kubenswrapper[4772]: E0930 17:21:14.317678 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.221:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-wfrg6" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.345892 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.210806326 podStartE2EDuration="26.345862304s" podCreationTimestamp="2025-09-30 17:20:48 +0000 UTC" firstStartedPulling="2025-09-30 17:20:50.326396394 +0000 UTC m=+1151.233409225" lastFinishedPulling="2025-09-30 17:21:13.461452372 +0000 UTC m=+1174.368465203" observedRunningTime="2025-09-30 17:21:14.317866651 +0000 UTC m=+1175.224879482" watchObservedRunningTime="2025-09-30 17:21:14.345862304 +0000 UTC 
m=+1175.252875135" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.392024 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.404883 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=4.136827146 podStartE2EDuration="26.404854229s" podCreationTimestamp="2025-09-30 17:20:48 +0000 UTC" firstStartedPulling="2025-09-30 17:20:50.182161438 +0000 UTC m=+1151.089174269" lastFinishedPulling="2025-09-30 17:21:12.450188521 +0000 UTC m=+1173.357201352" observedRunningTime="2025-09-30 17:21:14.35730126 +0000 UTC m=+1175.264314091" watchObservedRunningTime="2025-09-30 17:21:14.404854229 +0000 UTC m=+1175.311867060" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.412475 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.421485 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:14 crc kubenswrapper[4772]: E0930 17:21:14.435787 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.435833 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" Sep 30 17:21:14 crc kubenswrapper[4772]: E0930 17:21:14.435879 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.435887 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.436144 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.436161 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.437447 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.440448 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.441758 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.441960 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.462768 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.628320 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-logs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.630546 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.630811 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.631030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-config-data\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.631255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.631480 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshg5\" (UniqueName: \"kubernetes.io/projected/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-kube-api-access-dshg5\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.631700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735327 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dshg5\" (UniqueName: \"kubernetes.io/projected/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-kube-api-access-dshg5\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735429 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-logs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735502 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.735602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-config-data\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.736740 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-logs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.740774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-config-data\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.743626 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.743760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.746546 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.748097 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.763888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshg5\" (UniqueName: \"kubernetes.io/projected/a145ab07-1aa7-42d9-9ff7-83f68417fa0e-kube-api-access-dshg5\") pod \"watcher-api-0\" (UID: \"a145ab07-1aa7-42d9-9ff7-83f68417fa0e\") " pod="openstack/watcher-api-0" Sep 30 17:21:14 crc kubenswrapper[4772]: I0930 17:21:14.766165 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.301584 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.321703 4772 generic.go:334] "Generic (PLEG): container finished" podID="a13745ce-d903-4a06-a020-2095be2c3e55" containerID="5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e" exitCode=0 Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.321817 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" event={"ID":"a13745ce-d903-4a06-a020-2095be2c3e55","Type":"ContainerDied","Data":"5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e"} Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.327481 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79fbb4fcd8-68j8v" event={"ID":"ef6c9261-05fa-449e-87ba-2c33d858daec","Type":"ContainerStarted","Data":"519c6f1dd9a6a21037b05269f5ad81ca985acfdba48e8927753dcb926cde33bd"} Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.327566 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79fbb4fcd8-68j8v" event={"ID":"ef6c9261-05fa-449e-87ba-2c33d858daec","Type":"ContainerStarted","Data":"f67475c31208f796d8e19a8aacd73513f9e997d285f0249bdd89ba49bb6bac45"} Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.327902 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.327928 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-79fbb4fcd8-68j8v" Sep 30 17:21:15 crc kubenswrapper[4772]: W0930 17:21:15.328687 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda145ab07_1aa7_42d9_9ff7_83f68417fa0e.slice/crio-5f086c17afebafce8edcfdc820a91dbc9df4813fa8b9a26dce833d223218a1a3 WatchSource:0}: Error finding container 5f086c17afebafce8edcfdc820a91dbc9df4813fa8b9a26dce833d223218a1a3: Status 404 
returned error can't find the container with id 5f086c17afebafce8edcfdc820a91dbc9df4813fa8b9a26dce833d223218a1a3 Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.375443 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-79fbb4fcd8-68j8v" podStartSLOduration=17.375418928 podStartE2EDuration="17.375418928s" podCreationTimestamp="2025-09-30 17:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:15.364628119 +0000 UTC m=+1176.271640950" watchObservedRunningTime="2025-09-30 17:21:15.375418928 +0000 UTC m=+1176.282431759" Sep 30 17:21:15 crc kubenswrapper[4772]: I0930 17:21:15.918317 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" path="/var/lib/kubelet/pods/9f56ca6b-b0f8-4f59-8e69-a28d900046fe/volumes" Sep 30 17:21:16 crc kubenswrapper[4772]: I0930 17:21:16.338652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a145ab07-1aa7-42d9-9ff7-83f68417fa0e","Type":"ContainerStarted","Data":"5f086c17afebafce8edcfdc820a91dbc9df4813fa8b9a26dce833d223218a1a3"} Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.384360 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.384469 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9f56ca6b-b0f8-4f59-8e69-a28d900046fe" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.455313 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.455375 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.484429 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.534090 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.534142 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:18 crc kubenswrapper[4772]: I0930 17:21:18.567452 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 17:21:19.373107 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" event={"ID":"a13745ce-d903-4a06-a020-2095be2c3e55","Type":"ContainerStarted","Data":"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013"} Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 17:21:19.373252 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 
17:21:19.375238 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a145ab07-1aa7-42d9-9ff7-83f68417fa0e","Type":"ContainerStarted","Data":"79638c32c930cdbeb87ae060c5dc65a5e18564640d1c9fb6718365ea4423b2b5"} Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 17:21:19.401421 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" podStartSLOduration=7.4013988600000005 podStartE2EDuration="7.40139886s" podCreationTimestamp="2025-09-30 17:21:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:19.395031825 +0000 UTC m=+1180.302044656" watchObservedRunningTime="2025-09-30 17:21:19.40139886 +0000 UTC m=+1180.308411711" Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 17:21:19.411145 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 17:21:19 crc kubenswrapper[4772]: I0930 17:21:19.417813 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:20 crc kubenswrapper[4772]: I0930 17:21:20.409784 4772 generic.go:334] "Generic (PLEG): container finished" podID="40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" containerID="bf446099bc0634fd206503e4ecbd53eac5f303a6d83ddda96c06c4b8ae0b2974" exitCode=0 Sep 30 17:21:20 crc kubenswrapper[4772]: I0930 17:21:20.410246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwzjp" event={"ID":"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2","Type":"ContainerDied","Data":"bf446099bc0634fd206503e4ecbd53eac5f303a6d83ddda96c06c4b8ae0b2974"} Sep 30 17:21:20 crc kubenswrapper[4772]: I0930 17:21:20.415162 4772 generic.go:334] "Generic (PLEG): container finished" podID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerID="8d19337279315d1891c15d4b23002d6091d82b7dc6b7501c5bc2e6f7e8dd0d32" exitCode=1 Sep 30 17:21:20 crc kubenswrapper[4772]: I0930 17:21:20.415977 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerDied","Data":"8d19337279315d1891c15d4b23002d6091d82b7dc6b7501c5bc2e6f7e8dd0d32"} Sep 30 17:21:20 crc kubenswrapper[4772]: I0930 17:21:20.416300 4772 scope.go:117] "RemoveContainer" containerID="8d19337279315d1891c15d4b23002d6091d82b7dc6b7501c5bc2e6f7e8dd0d32" Sep 30 17:21:21 crc kubenswrapper[4772]: I0930 17:21:21.464137 4772 generic.go:334] "Generic (PLEG): container finished" podID="23c54c4a-b5a3-4234-8cdc-62d55390d7c9" containerID="4e48e248cc6d67699c0134651aa74d5261771500766df0325c489e76ce9306ef" exitCode=0 Sep 30 17:21:21 crc kubenswrapper[4772]: I0930 17:21:21.464225 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lvdj8" event={"ID":"23c54c4a-b5a3-4234-8cdc-62d55390d7c9","Type":"ContainerDied","Data":"4e48e248cc6d67699c0134651aa74d5261771500766df0325c489e76ce9306ef"} Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.706347 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.812355 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.812488 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.812541 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdjh8\" (UniqueName: \"kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.812637 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.813274 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.813400 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts\") pod \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\" (UID: \"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2\") " Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.816763 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.817024 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8" (OuterVolumeSpecName: "kube-api-access-fdjh8") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "kube-api-access-fdjh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.817940 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.819058 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts" (OuterVolumeSpecName: "scripts") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.843650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data" (OuterVolumeSpecName: "config-data") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.844931 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" (UID: "40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.896925 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.917438 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.918131 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdjh8\" (UniqueName: \"kubernetes.io/projected/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-kube-api-access-fdjh8\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.918145 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.918163 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.918175 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4772]: I0930 17:21:22.918187 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.019133 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config\") pod \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.019672 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zss5w\" (UniqueName: \"kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w\") pod \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.019834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle\") pod \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\" (UID: \"23c54c4a-b5a3-4234-8cdc-62d55390d7c9\") " Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.023628 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w" (OuterVolumeSpecName: "kube-api-access-zss5w") pod "23c54c4a-b5a3-4234-8cdc-62d55390d7c9" (UID: "23c54c4a-b5a3-4234-8cdc-62d55390d7c9"). InnerVolumeSpecName "kube-api-access-zss5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:23 crc kubenswrapper[4772]: E0930 17:21:23.029197 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.048443 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c54c4a-b5a3-4234-8cdc-62d55390d7c9" (UID: "23c54c4a-b5a3-4234-8cdc-62d55390d7c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.048486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config" (OuterVolumeSpecName: "config") pod "23c54c4a-b5a3-4234-8cdc-62d55390d7c9" (UID: "23c54c4a-b5a3-4234-8cdc-62d55390d7c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.123619 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.123691 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.123713 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zss5w\" (UniqueName: \"kubernetes.io/projected/23c54c4a-b5a3-4234-8cdc-62d55390d7c9-kube-api-access-zss5w\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.207230 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.274643 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.274880 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-769468b997-d9swz" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="dnsmasq-dns" containerID="cri-o://35947ac0a483794405a47f2bfd42a5c79f0152e322d704968d6e1a38c279aa24" gracePeriod=10 Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.496511 4772 generic.go:334] "Generic (PLEG): container finished" podID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerID="35947ac0a483794405a47f2bfd42a5c79f0152e322d704968d6e1a38c279aa24" exitCode=0 Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.496575 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-769468b997-d9swz" event={"ID":"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f","Type":"ContainerDied","Data":"35947ac0a483794405a47f2bfd42a5c79f0152e322d704968d6e1a38c279aa24"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.498679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a145ab07-1aa7-42d9-9ff7-83f68417fa0e","Type":"ContainerStarted","Data":"ec8d253d7a140cc6bfd27d6745e47cf492f701c211b4eec0af1c34f346686296"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.499775 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.510800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lvdj8" event={"ID":"23c54c4a-b5a3-4234-8cdc-62d55390d7c9","Type":"ContainerDied","Data":"b809ff8bdee644a480d13823c3b8e41db257743de90fe14e1dd6e59682eea32c"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.510839 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b809ff8bdee644a480d13823c3b8e41db257743de90fe14e1dd6e59682eea32c" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.510896 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-lvdj8" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.516741 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwzjp" event={"ID":"40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2","Type":"ContainerDied","Data":"deb6e52c5e903919632c9bc0d45184c8fb5e8d4f19d438f801dc0bbc7d6af93b"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.516788 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb6e52c5e903919632c9bc0d45184c8fb5e8d4f19d438f801dc0bbc7d6af93b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.516860 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xwzjp" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.521621 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.523703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerStarted","Data":"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2"} Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.523841 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-central-agent" containerID="cri-o://55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7" gracePeriod=30 Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.524826 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.524888 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-notification-agent" containerID="cri-o://e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38" gracePeriod=30 Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.524981 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="proxy-httpd" containerID="cri-o://421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2" gracePeriod=30 Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.534929 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=9.53491333 podStartE2EDuration="9.53491333s" podCreationTimestamp="2025-09-30 17:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:23.531463091 +0000 UTC m=+1184.438475922" watchObservedRunningTime="2025-09-30 17:21:23.53491333 +0000 UTC m=+1184.441926161" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.712786 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:23 crc kubenswrapper[4772]: E0930 17:21:23.713266 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c54c4a-b5a3-4234-8cdc-62d55390d7c9" containerName="neutron-db-sync" Sep 30 17:21:23 crc 
kubenswrapper[4772]: I0930 17:21:23.713281 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c54c4a-b5a3-4234-8cdc-62d55390d7c9" containerName="neutron-db-sync" Sep 30 17:21:23 crc kubenswrapper[4772]: E0930 17:21:23.713295 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" containerName="keystone-bootstrap" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.713301 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" containerName="keystone-bootstrap" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.713530 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c54c4a-b5a3-4234-8cdc-62d55390d7c9" containerName="neutron-db-sync" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.713558 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" containerName="keystone-bootstrap" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.714876 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.732006 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.844368 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.844411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.848274 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.848447 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.848577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.848655 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.848832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqtv\" (UniqueName: \"kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.857544 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.858229 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.858721 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.858907 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6zwns" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.882556 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.930857 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76ff4c9cf5-7gpvg"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.939694 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.943148 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.946250 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.946441 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fsmbb" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.946677 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.946849 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.946967 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954690 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqtv\" (UniqueName: \"kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954963 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-znknl\" (UniqueName: \"kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.954981 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.955015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.955038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.957005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.957387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.958438 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.959702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.977008 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76ff4c9cf5-7gpvg"] Sep 30 17:21:23 crc kubenswrapper[4772]: I0930 17:21:23.993662 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqtv\" (UniqueName: \"kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv\") pod \"dnsmasq-dns-7974f5645c-fp6cj\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059677 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-combined-ca-bundle\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059726 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-public-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059840 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-internal-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059880 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znknl\" (UniqueName: \"kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059904 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.059985 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-fernet-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-credential-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060082 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-scripts\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060105 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lrqn\" (UniqueName: \"kubernetes.io/projected/43dbf436-1404-454d-ab9a-870ba144ade3-kube-api-access-8lrqn\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060149 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-config-data\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.060175 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.066068 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.071995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.073937 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.076178 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.092517 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znknl\" (UniqueName: \"kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl\") pod \"neutron-5968d57d6b-kr75b\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.098529 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.131713 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.162417 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc\") pod \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.162536 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb\") pod \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.162713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb\") pod \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.162828 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config\") pod \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.162977 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qswgc\" (UniqueName: \"kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc\") pod \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\" (UID: \"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f\") " Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163358 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-config-data\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163415 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-combined-ca-bundle\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163431 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-public-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163867 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-internal-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 
17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-fernet-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163956 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-credential-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.163985 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-scripts\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.164007 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lrqn\" (UniqueName: \"kubernetes.io/projected/43dbf436-1404-454d-ab9a-870ba144ade3-kube-api-access-8lrqn\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.180557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-public-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.183136 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.184409 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-fernet-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.188501 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc" (OuterVolumeSpecName: "kube-api-access-qswgc") pod "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" (UID: "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f"). InnerVolumeSpecName "kube-api-access-qswgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.188786 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-credential-keys\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.188802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-combined-ca-bundle\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.188844 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-scripts\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.189534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-config-data\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.192968 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dbf436-1404-454d-ab9a-870ba144ade3-internal-tls-certs\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.215754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lrqn\" (UniqueName: \"kubernetes.io/projected/43dbf436-1404-454d-ab9a-870ba144ade3-kube-api-access-8lrqn\") pod \"keystone-76ff4c9cf5-7gpvg\" (UID: \"43dbf436-1404-454d-ab9a-870ba144ade3\") " pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.266528 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qswgc\" (UniqueName: \"kubernetes.io/projected/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-kube-api-access-qswgc\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.305577 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config" (OuterVolumeSpecName: "config") pod "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" (UID: "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.307856 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" (UID: "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.320447 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" (UID: "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.336619 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" (UID: "bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.368641 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.368679 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.368690 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.368699 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.412564 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.555927 4772 generic.go:334] "Generic (PLEG): container finished" podID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerID="421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2" exitCode=0 Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.556004 4772 generic.go:334] "Generic (PLEG): container finished" podID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerID="55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7" exitCode=0 Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.556079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerDied","Data":"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2"} Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.556109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerDied","Data":"55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7"} Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.565694 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-769468b997-d9swz" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.566133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-769468b997-d9swz" event={"ID":"bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f","Type":"ContainerDied","Data":"58a23f12b3d4c938d46275caee36fd116ec8fa07e06766f6fdbd8dc3fa0ff19c"} Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.566178 4772 scope.go:117] "RemoveContainer" containerID="35947ac0a483794405a47f2bfd42a5c79f0152e322d704968d6e1a38c279aa24" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.598882 4772 scope.go:117] "RemoveContainer" containerID="6e6daa3966de40c8e5286931751cadbf06231f13e908509b97fba5a921203603" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.615666 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.623298 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-769468b997-d9swz"] Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.768128 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.768175 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 17:21:24 crc kubenswrapper[4772]: I0930 17:21:24.832802 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.174048 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.208093 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76ff4c9cf5-7gpvg"] Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.585781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerStarted","Data":"4fa4ec9133bcf60c195f54bfff00ccce3018497f59fcd905fbccde491112e8bb"} Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.593244 4772 generic.go:334] "Generic (PLEG): container finished" podID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerID="4baa482295d82285360960fd3490435046dd55d7ecb5ec153e282315df0b350a" exitCode=0 Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.593379 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" event={"ID":"01ee93d8-ee88-45b4-b28a-9446a95be4e0","Type":"ContainerDied","Data":"4baa482295d82285360960fd3490435046dd55d7ecb5ec153e282315df0b350a"} Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.593429 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" event={"ID":"01ee93d8-ee88-45b4-b28a-9446a95be4e0","Type":"ContainerStarted","Data":"9369b98fcdf6de225e6881bcaa6cb24fc091e6d1b8b607f56ed12c0441b88294"} Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.603097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76ff4c9cf5-7gpvg" event={"ID":"43dbf436-1404-454d-ab9a-870ba144ade3","Type":"ContainerStarted","Data":"ed7eef2db0ecff867f0f30900e9b8034ad46a1160d9508bc81ddd3501dd58bba"} Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.630885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f85ns" 
event={"ID":"7633806b-c365-4597-b298-1e9767c640d4","Type":"ContainerStarted","Data":"105254529cd8396980d60915c8aa90f7066b2c99b6235d19bead5afcd5b5e448"} Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.630935 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.729789 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f85ns" podStartSLOduration=3.1657733 podStartE2EDuration="36.729750257s" podCreationTimestamp="2025-09-30 17:20:49 +0000 UTC" firstStartedPulling="2025-09-30 17:20:50.434413604 +0000 UTC m=+1151.341426435" lastFinishedPulling="2025-09-30 17:21:23.998390561 +0000 UTC m=+1184.905403392" observedRunningTime="2025-09-30 17:21:25.662036207 +0000 UTC m=+1186.569049058" watchObservedRunningTime="2025-09-30 17:21:25.729750257 +0000 UTC m=+1186.636763088" Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.772435 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="a145ab07-1aa7-42d9-9ff7-83f68417fa0e" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:21:25 crc kubenswrapper[4772]: I0930 17:21:25.913430 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" path="/var/lib/kubelet/pods/bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f/volumes" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.645750 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" event={"ID":"01ee93d8-ee88-45b4-b28a-9446a95be4e0","Type":"ContainerStarted","Data":"dcd3f39d0c024c9e0caf4846678ea0353a772ab1a1e46917a832c3239a45ee5f"} Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.646545 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.647422 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76ff4c9cf5-7gpvg" event={"ID":"43dbf436-1404-454d-ab9a-870ba144ade3","Type":"ContainerStarted","Data":"90098bd731b3bc912644b301eba39f3bb85173bbf7b5c9d84481594b970695ae"} Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.647529 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.649044 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.649722 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerStarted","Data":"52e6b764f4ed2804f0973e9b20adaf67a7c262b62645628878c16351d6c49f78"} Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.673969 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" podStartSLOduration=3.673941424 podStartE2EDuration="3.673941424s" podCreationTimestamp="2025-09-30 17:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:26.668308568 +0000 UTC m=+1187.575321399" watchObservedRunningTime="2025-09-30 17:21:26.673941424 +0000 UTC m=+1187.580954265" Sep 30 17:21:26 crc 
kubenswrapper[4772]: I0930 17:21:26.701864 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76ff4c9cf5-7gpvg" podStartSLOduration=3.701834955 podStartE2EDuration="3.701834955s" podCreationTimestamp="2025-09-30 17:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:26.694557167 +0000 UTC m=+1187.601570018" watchObservedRunningTime="2025-09-30 17:21:26.701834955 +0000 UTC m=+1187.608847796" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.776387 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-684cbd44c-xstzf"] Sep 30 17:21:26 crc kubenswrapper[4772]: E0930 17:21:26.776741 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="init" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.776756 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="init" Sep 30 17:21:26 crc kubenswrapper[4772]: E0930 17:21:26.776775 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="dnsmasq-dns" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.776780 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="dnsmasq-dns" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.776985 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb7f91c-c8d7-4865-b7b9-a4b0804cb87f" containerName="dnsmasq-dns" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.778127 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.781714 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.781869 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.801992 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-684cbd44c-xstzf"] Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-combined-ca-bundle\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842330 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzprj\" (UniqueName: \"kubernetes.io/projected/d878293c-0383-4575-95cb-1062bcb4634e-kube-api-access-gzprj\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-httpd-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842432 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-ovndb-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842461 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-public-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842855 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-internal-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.842913 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945099 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-ovndb-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945166 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-public-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-internal-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945322 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-combined-ca-bundle\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945361 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzprj\" (UniqueName: \"kubernetes.io/projected/d878293c-0383-4575-95cb-1062bcb4634e-kube-api-access-gzprj\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.945407 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-httpd-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.951602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-internal-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.951618 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.951712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-combined-ca-bundle\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " 
pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.951956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-public-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.953394 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-ovndb-tls-certs\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.968493 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d878293c-0383-4575-95cb-1062bcb4634e-httpd-config\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:26 crc kubenswrapper[4772]: I0930 17:21:26.970435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzprj\" (UniqueName: \"kubernetes.io/projected/d878293c-0383-4575-95cb-1062bcb4634e-kube-api-access-gzprj\") pod \"neutron-684cbd44c-xstzf\" (UID: \"d878293c-0383-4575-95cb-1062bcb4634e\") " pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:27 crc kubenswrapper[4772]: I0930 17:21:27.099383 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:27 crc kubenswrapper[4772]: I0930 17:21:27.788144 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.337233 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-684cbd44c-xstzf"] Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.533236 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:28 crc kubenswrapper[4772]: E0930 17:21:28.533856 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28 is running failed: container process not found" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:21:28 crc kubenswrapper[4772]: E0930 17:21:28.534430 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28 is running failed: container process not found" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:21:28 crc kubenswrapper[4772]: E0930 17:21:28.534818 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28 is running failed: container process not found" 
containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:21:28 crc kubenswrapper[4772]: E0930 17:21:28.534878 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerName="watcher-decision-engine" Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.672452 4772 generic.go:334] "Generic (PLEG): container finished" podID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" exitCode=1 Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.672765 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerDied","Data":"70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28"} Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.673016 4772 scope.go:117] "RemoveContainer" containerID="8d19337279315d1891c15d4b23002d6091d82b7dc6b7501c5bc2e6f7e8dd0d32" Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.673614 4772 scope.go:117] "RemoveContainer" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" Sep 30 17:21:28 crc kubenswrapper[4772]: E0930 17:21:28.674074 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.676075 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerStarted","Data":"157aa95c6a326da39e72adbee81d1195c38929f51a9219408226f55f5ce5efa0"} Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.678448 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.681008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wfrg6" event={"ID":"93e25cc4-9ac5-4e36-87b0-4523bba98b4b","Type":"ContainerStarted","Data":"999b8f64a3c19b21dc80511ac25737d3a9d403b481e77c769b6d864026daa8c9"} Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.682918 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-684cbd44c-xstzf" event={"ID":"d878293c-0383-4575-95cb-1062bcb4634e","Type":"ContainerStarted","Data":"b006a0e014aacbf5ce8a9113e022eb14c165c3099b7512137340ed515ebfd0ac"} Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.682959 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-684cbd44c-xstzf" event={"ID":"d878293c-0383-4575-95cb-1062bcb4634e","Type":"ContainerStarted","Data":"3f2a3392bd41eb9331f2c48ee3365f7e85e9ca62de906d2edea6e9582649569f"} Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.746823 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wfrg6" 
Sep 30 17:21:28 crc kubenswrapper[4772]: I0930 17:21:28.761835 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5968d57d6b-kr75b" podStartSLOduration=5.761802086 podStartE2EDuration="5.761802086s" podCreationTimestamp="2025-09-30 17:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:28.732508979 +0000 UTC m=+1189.639521830" watchObservedRunningTime="2025-09-30 17:21:28.761802086 +0000 UTC m=+1189.668814917"
Sep 30 17:21:29 crc kubenswrapper[4772]: I0930 17:21:29.698115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-684cbd44c-xstzf" event={"ID":"d878293c-0383-4575-95cb-1062bcb4634e","Type":"ContainerStarted","Data":"203eb37b75d5b26612d1d61f4fe59645b8be6d2883d58b24b1c44c9ea92747a9"}
Sep 30 17:21:29 crc kubenswrapper[4772]: I0930 17:21:29.698661 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-684cbd44c-xstzf"
Sep 30 17:21:30 crc kubenswrapper[4772]: I0930 17:21:30.244604 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79fbb4fcd8-68j8v"
Sep 30 17:21:30 crc kubenswrapper[4772]: I0930 17:21:30.245386 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79fbb4fcd8-68j8v"
Sep 30 17:21:30 crc kubenswrapper[4772]: I0930 17:21:30.276460 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-684cbd44c-xstzf" podStartSLOduration=4.276424438 podStartE2EDuration="4.276424438s" podCreationTimestamp="2025-09-30 17:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:29.72678558 +0000 UTC m=+1190.633798421" watchObservedRunningTime="2025-09-30 17:21:30.276424438 +0000 UTC m=+1191.183437279"
Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.576762 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681155 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681341 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681418 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681693 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4r2v\" (UniqueName: \"kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681752 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681871 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.681929 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml\") pod \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\" (UID: \"1be5891a-e27f-4f51-868f-90a7ade7d4bb\") " Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.682752 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.682902 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.705176 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts" (OuterVolumeSpecName: "scripts") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.705229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v" (OuterVolumeSpecName: "kube-api-access-m4r2v") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "kube-api-access-m4r2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.706335 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.756108 4772 generic.go:334] "Generic (PLEG): container finished" podID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerID="e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38" exitCode=0 Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.756164 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerDied","Data":"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38"} Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.756199 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be5891a-e27f-4f51-868f-90a7ade7d4bb","Type":"ContainerDied","Data":"e4b60028358e320573e578b96a01f1b04a5d1f40967c8532f3070b35bc16e2ce"} Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.756227 4772 scope.go:117] "RemoveContainer" containerID="421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.756529 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.766835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784403 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784435 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784445 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784456 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784465 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be5891a-e27f-4f51-868f-90a7ade7d4bb-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.784474 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4r2v\" (UniqueName: \"kubernetes.io/projected/1be5891a-e27f-4f51-868f-90a7ade7d4bb-kube-api-access-m4r2v\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.785966 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data" (OuterVolumeSpecName: "config-data") pod "1be5891a-e27f-4f51-868f-90a7ade7d4bb" (UID: "1be5891a-e27f-4f51-868f-90a7ade7d4bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.804563 4772 scope.go:117] "RemoveContainer" containerID="e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.826161 4772 scope.go:117] "RemoveContainer" containerID="55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.850806 4772 scope.go:117] "RemoveContainer" containerID="421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2" Sep 30 17:21:32 crc kubenswrapper[4772]: E0930 17:21:32.851409 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2\": container with ID starting with 421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2 not found: ID does not exist" containerID="421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.851459 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2"} err="failed to get container status \"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2\": rpc error: code = NotFound desc = could not find container \"421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2\": container with ID starting with 421e28e487df4322ca4aa2ae02964483d8146960140b3cce6dbf0f11edb848c2 not found: ID does not exist" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.851682 4772 scope.go:117] "RemoveContainer" containerID="e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38" Sep 30 17:21:32 crc kubenswrapper[4772]: E0930 17:21:32.852074 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38\": container with ID starting with e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38 not found: ID does not exist" containerID="e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.852116 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38"} err="failed to get container status \"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38\": rpc error: code = NotFound desc = could not find container \"e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38\": container with ID starting with e9d52132f6455ccc9cd8046a002c1085ff04148f23a682bc353daedc60432a38 not found: ID does not exist" Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.852143 4772 scope.go:117] "RemoveContainer" containerID="55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7" Sep 30 17:21:32 crc kubenswrapper[4772]: E0930 17:21:32.852457 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7\": container with ID starting with 55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7 not found: ID does not exist" containerID="55619706ab16751e912579a66ea9262bac1caa7c6406670633a43a773a9798b7" Sep 30 17:21:32 crc 
Sep 30 17:21:32 crc kubenswrapper[4772]: I0930 17:21:32.886279 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be5891a-e27f-4f51-868f-90a7ade7d4bb-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.168176 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.178391 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.203280 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:21:33 crc kubenswrapper[4772]: E0930 17:21:33.208998 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-notification-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209035 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-notification-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: E0930 17:21:33.209050 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="proxy-httpd"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209078 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="proxy-httpd"
Sep 30 17:21:33 crc kubenswrapper[4772]: E0930 17:21:33.209094 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-central-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209100 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-central-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209276 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="proxy-httpd"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209291 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-central-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.209304 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" containerName="ceilometer-notification-agent"
Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.213273 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.216487 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.216695 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.223294 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.296963 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297452 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297480 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4sk\" (UniqueName: \"kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297539 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297563 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.297787 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.400592 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.400831 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.400881 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.400922 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.401005 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4sk\" (UniqueName: \"kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.401038 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.401088 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.401507 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.402736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.409957 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.410180 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.410692 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.411220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.421521 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4sk\" (UniqueName: \"kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk\") pod \"ceilometer-0\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.606304 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.767283 4772 generic.go:334] "Generic (PLEG): container finished" podID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" containerID="999b8f64a3c19b21dc80511ac25737d3a9d403b481e77c769b6d864026daa8c9" exitCode=0 Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.767332 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wfrg6" event={"ID":"93e25cc4-9ac5-4e36-87b0-4523bba98b4b","Type":"ContainerDied","Data":"999b8f64a3c19b21dc80511ac25737d3a9d403b481e77c769b6d864026daa8c9"} Sep 30 17:21:33 crc kubenswrapper[4772]: I0930 17:21:33.909737 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be5891a-e27f-4f51-868f-90a7ade7d4bb" path="/var/lib/kubelet/pods/1be5891a-e27f-4f51-868f-90a7ade7d4bb/volumes" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.086973 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.134310 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.241504 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.241834 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" containerName="dnsmasq-dns" containerID="cri-o://b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013" gracePeriod=10 Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.776855 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.777477 4772 generic.go:334] "Generic (PLEG): container finished" podID="7633806b-c365-4597-b298-1e9767c640d4" containerID="105254529cd8396980d60915c8aa90f7066b2c99b6235d19bead5afcd5b5e448" exitCode=0 Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.777545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f85ns" event={"ID":"7633806b-c365-4597-b298-1e9767c640d4","Type":"ContainerDied","Data":"105254529cd8396980d60915c8aa90f7066b2c99b6235d19bead5afcd5b5e448"} Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.780594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerStarted","Data":"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469"} Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.780633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerStarted","Data":"b247136d4ed465ef0c28d9bfc61fdfa608785fcb5d4bf552e3d711d30708aa03"} Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.782340 4772 generic.go:334] "Generic (PLEG): container finished" podID="a13745ce-d903-4a06-a020-2095be2c3e55" containerID="b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013" exitCode=0 Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.782401 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" event={"ID":"a13745ce-d903-4a06-a020-2095be2c3e55","Type":"ContainerDied","Data":"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013"} Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.782424 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.782541 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccbf6dc95-58jkq" event={"ID":"a13745ce-d903-4a06-a020-2095be2c3e55","Type":"ContainerDied","Data":"172ce5b9832b8557654eb5391204d7d6bcbe10c163804f66ceffb3366cc6a900"} Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.782567 4772 scope.go:117] "RemoveContainer" containerID="b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.808560 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.821458 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.844514 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc\") pod \"a13745ce-d903-4a06-a020-2095be2c3e55\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.844676 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config\") pod \"a13745ce-d903-4a06-a020-2095be2c3e55\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.844783 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb\") pod \"a13745ce-d903-4a06-a020-2095be2c3e55\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.844852 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb\") pod \"a13745ce-d903-4a06-a020-2095be2c3e55\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.844939 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvzp4\" (UniqueName: \"kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4\") pod \"a13745ce-d903-4a06-a020-2095be2c3e55\" (UID: \"a13745ce-d903-4a06-a020-2095be2c3e55\") " Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.852843 4772 scope.go:117] "RemoveContainer" containerID="5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.866268 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4" (OuterVolumeSpecName: "kube-api-access-cvzp4") pod "a13745ce-d903-4a06-a020-2095be2c3e55" (UID: "a13745ce-d903-4a06-a020-2095be2c3e55"). InnerVolumeSpecName "kube-api-access-cvzp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.938130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a13745ce-d903-4a06-a020-2095be2c3e55" (UID: "a13745ce-d903-4a06-a020-2095be2c3e55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.948386 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config" (OuterVolumeSpecName: "config") pod "a13745ce-d903-4a06-a020-2095be2c3e55" (UID: "a13745ce-d903-4a06-a020-2095be2c3e55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.949232 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.949260 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.949275 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzp4\" (UniqueName: \"kubernetes.io/projected/a13745ce-d903-4a06-a020-2095be2c3e55-kube-api-access-cvzp4\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.954442 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a13745ce-d903-4a06-a020-2095be2c3e55" (UID: "a13745ce-d903-4a06-a020-2095be2c3e55"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.957601 4772 scope.go:117] "RemoveContainer" containerID="b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013" Sep 30 17:21:34 crc kubenswrapper[4772]: E0930 17:21:34.960335 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013\": container with ID starting with b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013 not found: ID does not exist" containerID="b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.960393 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013"} err="failed to get container status \"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013\": rpc error: code = NotFound desc = could not find container \"b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013\": container with ID starting with b49f38c5e4ffa038953ca8d134b71aff7a974ae66dd1f0f1be729dfc6c04c013 not found: ID does not exist" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.960419 4772 scope.go:117] "RemoveContainer" containerID="5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e" Sep 30 17:21:34 crc kubenswrapper[4772]: E0930 17:21:34.962753 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e\": container with ID starting with 5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e not found: ID does not exist" containerID="5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.962807 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e"} err="failed to get container status \"5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e\": rpc error: code = NotFound desc = could not find container \"5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e\": container with ID starting with 5c80179e97d5cb460f508fe091aa7be429fe54fdde696274109034440c936c2e not found: ID does not exist" Sep 30 17:21:34 crc kubenswrapper[4772]: I0930 17:21:34.990825 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a13745ce-d903-4a06-a020-2095be2c3e55" (UID: "a13745ce-d903-4a06-a020-2095be2c3e55"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.052452 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.053519 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13745ce-d903-4a06-a020-2095be2c3e55-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.131182 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.145993 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccbf6dc95-58jkq"] Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.329570 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.468873 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data\") pod \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.468942 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle\") pod \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.469041 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z8h8\" (UniqueName: \"kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8\") pod \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\" (UID: \"93e25cc4-9ac5-4e36-87b0-4523bba98b4b\") " Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.472812 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "93e25cc4-9ac5-4e36-87b0-4523bba98b4b" (UID: "93e25cc4-9ac5-4e36-87b0-4523bba98b4b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.474185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8" (OuterVolumeSpecName: "kube-api-access-4z8h8") pod "93e25cc4-9ac5-4e36-87b0-4523bba98b4b" (UID: "93e25cc4-9ac5-4e36-87b0-4523bba98b4b"). InnerVolumeSpecName "kube-api-access-4z8h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.498650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93e25cc4-9ac5-4e36-87b0-4523bba98b4b" (UID: "93e25cc4-9ac5-4e36-87b0-4523bba98b4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.577507 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z8h8\" (UniqueName: \"kubernetes.io/projected/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-kube-api-access-4z8h8\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.577572 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.577584 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e25cc4-9ac5-4e36-87b0-4523bba98b4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.804198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerStarted","Data":"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42"} Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.804276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerStarted","Data":"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6"} Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.807792 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wfrg6" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.807902 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wfrg6" event={"ID":"93e25cc4-9ac5-4e36-87b0-4523bba98b4b","Type":"ContainerDied","Data":"f8bfea4fefe47ca6ea9bc782144f60ccca4eea41c81df9f281683e0c170d3c23"} Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.807938 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8bfea4fefe47ca6ea9bc782144f60ccca4eea41c81df9f281683e0c170d3c23" Sep 30 17:21:35 crc kubenswrapper[4772]: I0930 17:21:35.911503 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" path="/var/lib/kubelet/pods/a13745ce-d903-4a06-a020-2095be2c3e55/volumes" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.106959 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-658db7799c-88bsl"] Sep 30 17:21:36 crc kubenswrapper[4772]: E0930 17:21:36.108538 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" containerName="init" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.108559 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" containerName="init" Sep 30 17:21:36 crc kubenswrapper[4772]: E0930 17:21:36.108577 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" containerName="barbican-db-sync" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.108583 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" containerName="barbican-db-sync" Sep 30 17:21:36 crc kubenswrapper[4772]: E0930 17:21:36.108594 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" 
containerName="dnsmasq-dns" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.108600 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" containerName="dnsmasq-dns" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.108764 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13745ce-d903-4a06-a020-2095be2c3e55" containerName="dnsmasq-dns" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.108809 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" containerName="barbican-db-sync" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.110832 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.113241 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w8gft" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.133008 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.133160 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.149294 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f749c9554-fhqsc"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.151024 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.157068 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.171960 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f749c9554-fhqsc"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193494 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dc6bc7-3f82-4108-afa6-15ac7055676a-logs\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193568 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b84c03-7c14-47a5-9f86-cca25e0bf92e-logs\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193589 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-combined-ca-bundle\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193631 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-combined-ca-bundle\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hj72\" (UniqueName: \"kubernetes.io/projected/84dc6bc7-3f82-4108-afa6-15ac7055676a-kube-api-access-2hj72\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cskp2\" (UniqueName: \"kubernetes.io/projected/31b84c03-7c14-47a5-9f86-cca25e0bf92e-kube-api-access-cskp2\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193760 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193798 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data-custom\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193817 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193848 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data-custom\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.193890 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-658db7799c-88bsl"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295572 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dc6bc7-3f82-4108-afa6-15ac7055676a-logs\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295620 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/31b84c03-7c14-47a5-9f86-cca25e0bf92e-logs\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295644 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-combined-ca-bundle\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295669 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-combined-ca-bundle\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295692 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hj72\" (UniqueName: \"kubernetes.io/projected/84dc6bc7-3f82-4108-afa6-15ac7055676a-kube-api-access-2hj72\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295728 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cskp2\" (UniqueName: \"kubernetes.io/projected/31b84c03-7c14-47a5-9f86-cca25e0bf92e-kube-api-access-cskp2\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data-custom\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.295834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data-custom\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 
17:21:36.297161 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dc6bc7-3f82-4108-afa6-15ac7055676a-logs\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.297451 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b84c03-7c14-47a5-9f86-cca25e0bf92e-logs\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.302170 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.304457 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.310786 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.311554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-config-data-custom\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.311620 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-combined-ca-bundle\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.313220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dc6bc7-3f82-4108-afa6-15ac7055676a-combined-ca-bundle\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" (UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.318817 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.322490 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data-custom\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.331642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hj72\" (UniqueName: \"kubernetes.io/projected/84dc6bc7-3f82-4108-afa6-15ac7055676a-kube-api-access-2hj72\") pod \"barbican-keystone-listener-f749c9554-fhqsc\" 
(UID: \"84dc6bc7-3f82-4108-afa6-15ac7055676a\") " pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.331426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b84c03-7c14-47a5-9f86-cca25e0bf92e-config-data\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.337987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cskp2\" (UniqueName: \"kubernetes.io/projected/31b84c03-7c14-47a5-9f86-cca25e0bf92e-kube-api-access-cskp2\") pod \"barbican-worker-658db7799c-88bsl\" (UID: \"31b84c03-7c14-47a5-9f86-cca25e0bf92e\") " pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.382035 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.386526 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.393240 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.394371 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.397023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.397118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.397170 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.397217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2whr\" (UniqueName: \"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.397250 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " 
pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.437419 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f85ns" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.476837 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-658db7799c-88bsl" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.498907 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499146 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499173 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499194 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499222 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499278 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mp98\" (UniqueName: \"kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98\") pod \"7633806b-c365-4597-b298-1e9767c640d4\" (UID: \"7633806b-c365-4597-b298-1e9767c640d4\") " Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499474 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499503 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc\") pod 
\"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddsv\" (UniqueName: \"kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499630 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2whr\" (UniqueName: \"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.499746 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.502219 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.502290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.505324 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.507402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.507461 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts" (OuterVolumeSpecName: "scripts") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.508687 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.510637 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.510869 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98" (OuterVolumeSpecName: "kube-api-access-8mp98") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "kube-api-access-8mp98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.521248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.532622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2whr\" (UniqueName: \"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr\") pod \"dnsmasq-dns-645559f75f-cs4x9\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.546094 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.587865 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data" (OuterVolumeSpecName: "config-data") pod "7633806b-c365-4597-b298-1e9767c640d4" (UID: "7633806b-c365-4597-b298-1e9767c640d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.623703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddsv\" (UniqueName: \"kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.623820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.623898 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.623952 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624037 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle\") pod 
\"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624201 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624217 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7633806b-c365-4597-b298-1e9767c640d4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624229 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624237 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624248 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mp98\" (UniqueName: \"kubernetes.io/projected/7633806b-c365-4597-b298-1e9767c640d4-kube-api-access-8mp98\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624258 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7633806b-c365-4597-b298-1e9767c640d4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.624803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.631037 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.631269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.632715 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle\") pod \"barbican-api-56f46f54c4-2r9ck\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.642719 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddsv\" (UniqueName: \"kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv\") pod \"barbican-api-56f46f54c4-2r9ck\" 
(UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.754343 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.772359 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.828276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f85ns" event={"ID":"7633806b-c365-4597-b298-1e9767c640d4","Type":"ContainerDied","Data":"5838cdfd76fc76331754a992f938e0cb8306218dc6c073658c7df445460ee976"} Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.828317 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5838cdfd76fc76331754a992f938e0cb8306218dc6c073658c7df445460ee976" Sep 30 17:21:36 crc kubenswrapper[4772]: I0930 17:21:36.828373 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f85ns" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.044219 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-658db7799c-88bsl"] Sep 30 17:21:37 crc kubenswrapper[4772]: W0930 17:21:37.050461 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31b84c03_7c14_47a5_9f86_cca25e0bf92e.slice/crio-41513d3c4ee308ec1d98f389df7da1669c4a9a6862354855aaa385cdbd4827bf WatchSource:0}: Error finding container 41513d3c4ee308ec1d98f389df7da1669c4a9a6862354855aaa385cdbd4827bf: Status 404 returned error can't find the container with id 41513d3c4ee308ec1d98f389df7da1669c4a9a6862354855aaa385cdbd4827bf Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.136901 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:37 crc kubenswrapper[4772]: E0930 17:21:37.137872 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7633806b-c365-4597-b298-1e9767c640d4" containerName="cinder-db-sync" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.137903 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7633806b-c365-4597-b298-1e9767c640d4" containerName="cinder-db-sync" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.138180 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7633806b-c365-4597-b298-1e9767c640d4" containerName="cinder-db-sync" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.143259 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.154279 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.154485 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5qb4b" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.154618 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.154833 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.173890 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.192824 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f749c9554-fhqsc"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255268 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255318 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255367 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255410 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255445 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.255504 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.258015 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 
30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.330648 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.336176 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357383 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357472 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357525 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357594 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357676 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.357891 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.358508 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.359709 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.373229 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.374623 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.375543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.376483 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.406311 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq\") pod \"cinder-scheduler-0\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.430017 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.431931 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.433893 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.461717 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.461839 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786rv\" (UniqueName: \"kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.461922 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.461965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.461998 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.488178 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.488575 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.510146 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.564679 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.565550 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.565753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnr9f\" (UniqueName: \"kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.566004 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.566964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786rv\" (UniqueName: \"kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.565642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568474 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568568 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568644 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568725 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.568806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.569629 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.570176 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.570793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc\") pod \"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.600400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786rv\" (UniqueName: \"kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv\") pod 
\"dnsmasq-dns-69bb54587f-wpxpd\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671424 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671745 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnr9f\" (UniqueName: \"kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.671930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.672001 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.672349 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.674204 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.678752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts\") pod \"cinder-api-0\" (UID: 
\"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.680106 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.682019 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.683233 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.685366 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.696187 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnr9f\" (UniqueName: \"kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f\") pod \"cinder-api-0\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.775305 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.796430 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.866795 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658db7799c-88bsl" event={"ID":"31b84c03-7c14-47a5-9f86-cca25e0bf92e","Type":"ContainerStarted","Data":"41513d3c4ee308ec1d98f389df7da1669c4a9a6862354855aaa385cdbd4827bf"} Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.871559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerStarted","Data":"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d"} Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.873398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.883201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" event={"ID":"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f","Type":"ContainerStarted","Data":"ae1ec69295ef88328e3404285c76c4fd419f4a23c7e83bf09b56bc817cb27c2d"} Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.884797 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" event={"ID":"84dc6bc7-3f82-4108-afa6-15ac7055676a","Type":"ContainerStarted","Data":"34e68d49528fe90415aca2abe307296bb948e6ba8cdf582f3feb93e2a10f9e2a"} Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.891448 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerStarted","Data":"317d5c5351813c59d9f42830ad9aedf140483ec47f7d3155e3a79cfd89b1fc70"} Sep 30 17:21:37 crc kubenswrapper[4772]: I0930 17:21:37.909846 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.779441711 podStartE2EDuration="4.909825342s" podCreationTimestamp="2025-09-30 17:21:33 +0000 UTC" firstStartedPulling="2025-09-30 17:21:34.09697751 +0000 UTC m=+1195.003990341" lastFinishedPulling="2025-09-30 17:21:37.227361151 +0000 UTC m=+1198.134373972" observedRunningTime="2025-09-30 17:21:37.898409727 +0000 UTC m=+1198.805422558" watchObservedRunningTime="2025-09-30 17:21:37.909825342 +0000 UTC m=+1198.816838183" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.079166 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:38 crc kubenswrapper[4772]: E0930 17:21:38.136922 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod638f409e_860b_47a5_b2ee_b0d9fe2b3c2f.slice/crio-9ecdb966fb5bc85cbb65d26c70526aa568217cd97a618f0475d5f6d2576ca52b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod638f409e_860b_47a5_b2ee_b0d9fe2b3c2f.slice/crio-conmon-9ecdb966fb5bc85cbb65d26c70526aa568217cd97a618f0475d5f6d2576ca52b.scope\": RecentStats: unable to find data in memory cache]" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.470387 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.533390 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.534219 4772 scope.go:117] "RemoveContainer" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.585783 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.655255 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.655308 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.655347 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.656038 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.656120 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9" gracePeriod=600 Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.903809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerStarted","Data":"36dd36f7c3a99113ebccb8c28277c3c72ac46a3b3ac586ed8d8f6de24520c99d"} Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.905826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" event={"ID":"bf848a80-86ac-41c1-a85e-5cc4fb6e4192","Type":"ContainerStarted","Data":"2105055913c3aea0e4fb8ee22947cfdca46f50309efffb7be011974da2aa3993"} Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.907534 4772 generic.go:334] "Generic (PLEG): container finished" podID="638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" containerID="9ecdb966fb5bc85cbb65d26c70526aa568217cd97a618f0475d5f6d2576ca52b" exitCode=0 Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.907585 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" event={"ID":"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f","Type":"ContainerDied","Data":"9ecdb966fb5bc85cbb65d26c70526aa568217cd97a618f0475d5f6d2576ca52b"} Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.909314 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerStarted","Data":"444ad0c45ec066d045da14774788e2df3e3189095072553f3a0df8bd04435fca"} Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.914377 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9" exitCode=0 Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.915927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9"} Sep 30 17:21:38 crc kubenswrapper[4772]: I0930 17:21:38.915979 4772 scope.go:117] "RemoveContainer" containerID="abe03f1cdb5c96e46a9cb2863de12ede67a8becb76c4e1cb373ac762e5589161" Sep 30 17:21:39 crc kubenswrapper[4772]: W0930 17:21:39.352405 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f96fca1_c22d_49b4_ad56_c646c72c7807.slice/crio-8c2845b9cafa73bdd4596a05b9429614511965610082a1ffc192ae6bf58f747b WatchSource:0}: Error finding container 8c2845b9cafa73bdd4596a05b9429614511965610082a1ffc192ae6bf58f747b: Status 404 returned error can't find the container with id 8c2845b9cafa73bdd4596a05b9429614511965610082a1ffc192ae6bf58f747b Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.668696 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.730750 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb\") pod \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.730894 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb\") pod \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.732505 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc\") pod \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.732555 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2whr\" (UniqueName: \"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr\") pod \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.732602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config\") pod \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\" (UID: \"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f\") " Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.781525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr" (OuterVolumeSpecName: "kube-api-access-h2whr") pod "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" (UID: "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f"). InnerVolumeSpecName "kube-api-access-h2whr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.836100 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2whr\" (UniqueName: \"kubernetes.io/projected/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-kube-api-access-h2whr\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:39 crc kubenswrapper[4772]: I0930 17:21:39.984488 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.019080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" event={"ID":"638f409e-860b-47a5-b2ee-b0d9fe2b3c2f","Type":"ContainerDied","Data":"ae1ec69295ef88328e3404285c76c4fd419f4a23c7e83bf09b56bc817cb27c2d"} Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.019164 4772 scope.go:117] "RemoveContainer" containerID="9ecdb966fb5bc85cbb65d26c70526aa568217cd97a618f0475d5f6d2576ca52b" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.019272 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-645559f75f-cs4x9" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.028085 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerStarted","Data":"8c2845b9cafa73bdd4596a05b9429614511965610082a1ffc192ae6bf58f747b"} Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.069787 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config" (OuterVolumeSpecName: "config") pod "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" (UID: "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.085493 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" (UID: "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.154273 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.154306 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.187194 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" (UID: "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.241017 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" (UID: "638f409e-860b-47a5-b2ee-b0d9fe2b3c2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.270416 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.270663 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.549665 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 30 17:21:40 crc kubenswrapper[4772]: I0930 17:21:40.554403 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-645559f75f-cs4x9"] Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.040242 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.043021 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerStarted","Data":"9f90a0c7c82f6de94a3af86271fbe8a8a943b8d3bdd2a97f97252f1ba531f302"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.043247 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.043398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.044907 4772 generic.go:334] "Generic (PLEG): container finished" podID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerID="73d7693961fa0263231f72a02cb4626cc42303b8b633abe6832e335b3e51a4e8" exitCode=0 Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.044993 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" event={"ID":"bf848a80-86ac-41c1-a85e-5cc4fb6e4192","Type":"ContainerDied","Data":"73d7693961fa0263231f72a02cb4626cc42303b8b633abe6832e335b3e51a4e8"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.047534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" event={"ID":"84dc6bc7-3f82-4108-afa6-15ac7055676a","Type":"ContainerStarted","Data":"2e7c0f8f2c6c8f3ebfc77b2f1c2107c7c2c5fecd2704e69ec4be7d8cf4dc990a"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.050172 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.055796 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658db7799c-88bsl" event={"ID":"31b84c03-7c14-47a5-9f86-cca25e0bf92e","Type":"ContainerStarted","Data":"416ca1a18009c81beeac08f3f4e2bc8c2b0560f2ae014814bc2548799be6aa8e"} Sep 30 17:21:41 crc kubenswrapper[4772]: I0930 17:21:41.923942 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" path="/var/lib/kubelet/pods/638f409e-860b-47a5-b2ee-b0d9fe2b3c2f/volumes" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.094632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerStarted","Data":"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16"} Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.099264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" event={"ID":"84dc6bc7-3f82-4108-afa6-15ac7055676a","Type":"ContainerStarted","Data":"12606d6dee4f03e3e31d8b5d13f7d9c8507ee5c2297742865ca04b11b4a422a8"} Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.101424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerStarted","Data":"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007"} Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.109898 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658db7799c-88bsl" event={"ID":"31b84c03-7c14-47a5-9f86-cca25e0bf92e","Type":"ContainerStarted","Data":"6f597d50e01af817c3508205b63b8c2785a84bcaee5aaa49d1eb3f1921738b34"} Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.126940 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56f46f54c4-2r9ck" podStartSLOduration=6.126910594 podStartE2EDuration="6.126910594s" podCreationTimestamp="2025-09-30 17:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:41.132354725 +0000 UTC m=+1202.039367556" watchObservedRunningTime="2025-09-30 17:21:42.126910594 +0000 UTC m=+1203.033923425" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.132780 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f749c9554-fhqsc" podStartSLOduration=3.8873516710000002 podStartE2EDuration="6.132763326s" podCreationTimestamp="2025-09-30 17:21:36 +0000 UTC" firstStartedPulling="2025-09-30 17:21:37.204867819 +0000 UTC m=+1198.111880640" lastFinishedPulling="2025-09-30 17:21:39.450279464 +0000 UTC m=+1200.357292295" observedRunningTime="2025-09-30 17:21:42.125642242 +0000 UTC m=+1203.032655073" watchObservedRunningTime="2025-09-30 17:21:42.132763326 +0000 UTC m=+1203.039776157" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.133079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" event={"ID":"bf848a80-86ac-41c1-a85e-5cc4fb6e4192","Type":"ContainerStarted","Data":"19a824243e9fc35e89d39cd88bfa75befda3372aa2ef243efd7a59b2eb133d9c"} Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 
17:21:42.169186 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-658db7799c-88bsl" podStartSLOduration=3.786147676 podStartE2EDuration="6.169157327s" podCreationTimestamp="2025-09-30 17:21:36 +0000 UTC" firstStartedPulling="2025-09-30 17:21:37.071887712 +0000 UTC m=+1197.978900543" lastFinishedPulling="2025-09-30 17:21:39.454897363 +0000 UTC m=+1200.361910194" observedRunningTime="2025-09-30 17:21:42.163003437 +0000 UTC m=+1203.070016268" watchObservedRunningTime="2025-09-30 17:21:42.169157327 +0000 UTC m=+1203.076170148" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.202118 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" podStartSLOduration=5.202086038 podStartE2EDuration="5.202086038s" podCreationTimestamp="2025-09-30 17:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:42.188720182 +0000 UTC m=+1203.095733013" watchObservedRunningTime="2025-09-30 17:21:42.202086038 +0000 UTC m=+1203.109098869" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.305665 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-859bb54b8b-6n9dj"] Sep 30 17:21:42 crc kubenswrapper[4772]: E0930 17:21:42.306322 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" containerName="init" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.306388 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" containerName="init" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.307868 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="638f409e-860b-47a5-b2ee-b0d9fe2b3c2f" containerName="init" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.309022 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.335182 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.337982 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.424543 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-859bb54b8b-6n9dj"] Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454076 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-public-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454193 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data-custom\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454283 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzf5\" (UniqueName: \"kubernetes.io/projected/190ea63c-c6c0-47e8-988c-bd89113ef485-kube-api-access-2jzf5\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-combined-ca-bundle\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454459 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190ea63c-c6c0-47e8-988c-bd89113ef485-logs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.454532 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-internal-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.556854 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-combined-ca-bundle\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.556930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190ea63c-c6c0-47e8-988c-bd89113ef485-logs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.557036 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-internal-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.557085 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-public-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.557108 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data-custom\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.557165 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzf5\" (UniqueName: \"kubernetes.io/projected/190ea63c-c6c0-47e8-988c-bd89113ef485-kube-api-access-2jzf5\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.557233 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.611156 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.618835 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzf5\" (UniqueName: \"kubernetes.io/projected/190ea63c-c6c0-47e8-988c-bd89113ef485-kube-api-access-2jzf5\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.620536 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/190ea63c-c6c0-47e8-988c-bd89113ef485-logs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.636405 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-config-data-custom\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.636894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-public-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.637445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-combined-ca-bundle\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.640686 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190ea63c-c6c0-47e8-988c-bd89113ef485-internal-tls-certs\") pod \"barbican-api-859bb54b8b-6n9dj\" (UID: \"190ea63c-c6c0-47e8-988c-bd89113ef485\") " pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.669133 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:42 crc kubenswrapper[4772]: I0930 17:21:42.775858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.146582 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerStarted","Data":"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b"} Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.146991 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api-log" containerID="cri-o://aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" gracePeriod=30 Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.147166 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api" containerID="cri-o://9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" gracePeriod=30 Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.147302 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.155298 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerStarted","Data":"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326"} Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.194407 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.194388139 podStartE2EDuration="6.194388139s" podCreationTimestamp="2025-09-30 17:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:43.17197382 +0000 UTC m=+1204.078986651" watchObservedRunningTime="2025-09-30 17:21:43.194388139 +0000 UTC m=+1204.101400970" Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.212098 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.899640679 podStartE2EDuration="6.212080726s" podCreationTimestamp="2025-09-30 17:21:37 +0000 UTC" firstStartedPulling="2025-09-30 17:21:38.099876135 +0000 UTC m=+1199.006888966" lastFinishedPulling="2025-09-30 17:21:39.412316182 +0000 UTC m=+1200.319329013" observedRunningTime="2025-09-30 17:21:43.211466431 +0000 UTC m=+1204.118479262" watchObservedRunningTime="2025-09-30 17:21:43.212080726 +0000 UTC m=+1204.119093547" Sep 30 17:21:43 crc kubenswrapper[4772]: I0930 17:21:43.242025 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-859bb54b8b-6n9dj"] Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.151972 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.179555 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-859bb54b8b-6n9dj" event={"ID":"190ea63c-c6c0-47e8-988c-bd89113ef485","Type":"ContainerStarted","Data":"b27133f153d5fa0db15e2befd0c28fed0aedce8a6f43255384cd4db933563636"} Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.179608 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-859bb54b8b-6n9dj" event={"ID":"190ea63c-c6c0-47e8-988c-bd89113ef485","Type":"ContainerStarted","Data":"d0fa5ae52c24fc7c944d07343b267969e2bc5cf2d3489cf514df5fe9741f9e50"} Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182426 4772 generic.go:334] "Generic (PLEG): container finished" podID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerID="9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" exitCode=0 Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182469 4772 generic.go:334] "Generic (PLEG): container finished" podID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerID="aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" exitCode=143 Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182588 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerDied","Data":"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b"} Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182664 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerDied","Data":"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16"} Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f96fca1-c22d-49b4-ad56-c646c72c7807","Type":"ContainerDied","Data":"8c2845b9cafa73bdd4596a05b9429614511965610082a1ffc192ae6bf58f747b"} Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.182748 4772 scope.go:117] "RemoveContainer" containerID="9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.201811 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.202329 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnr9f\" (UniqueName: \"kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.202418 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 
17:21:44.202495 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.202545 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.202571 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.202749 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom\") pod \"1f96fca1-c22d-49b4-ad56-c646c72c7807\" (UID: \"1f96fca1-c22d-49b4-ad56-c646c72c7807\") " Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.204536 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.207446 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs" (OuterVolumeSpecName: "logs") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.224851 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.225416 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f" (OuterVolumeSpecName: "kube-api-access-lnr9f") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "kube-api-access-lnr9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.227426 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts" (OuterVolumeSpecName: "scripts") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.246460 4772 scope.go:117] "RemoveContainer" containerID="aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.272507 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.303423 4772 scope.go:117] "RemoveContainer" containerID="9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.304989 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.305011 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnr9f\" (UniqueName: \"kubernetes.io/projected/1f96fca1-c22d-49b4-ad56-c646c72c7807-kube-api-access-lnr9f\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.305024 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.305035 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f96fca1-c22d-49b4-ad56-c646c72c7807-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.305047 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f96fca1-c22d-49b4-ad56-c646c72c7807-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.305076 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: E0930 17:21:44.306697 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b\": container with ID starting with 9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b not found: ID does not exist" containerID="9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.306741 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b"} err="failed to get container status \"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b\": rpc error: code = NotFound desc = could not find container \"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b\": container with ID starting with 9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b not found: ID does not exist" Sep 30 17:21:44 crc kubenswrapper[4772]: 
I0930 17:21:44.306769 4772 scope.go:117] "RemoveContainer" containerID="aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" Sep 30 17:21:44 crc kubenswrapper[4772]: E0930 17:21:44.307658 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16\": container with ID starting with aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16 not found: ID does not exist" containerID="aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.307680 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16"} err="failed to get container status \"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16\": rpc error: code = NotFound desc = could not find container \"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16\": container with ID starting with aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16 not found: ID does not exist" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.307703 4772 scope.go:117] "RemoveContainer" containerID="9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.308910 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b"} err="failed to get container status \"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b\": rpc error: code = NotFound desc = could not find container \"9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b\": container with ID starting with 9009e93be186cc962da67bc0b7a2aba047ecec02cc839fa5bcc77c6680eb5c3b not found: ID does not exist" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.308958 4772 scope.go:117] "RemoveContainer" containerID="aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.309694 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16"} err="failed to get container status \"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16\": rpc error: code = NotFound desc = could not find container \"aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16\": container with ID starting with aa68acccc7a30093a647bd168350d63cc9bb78e4fbc0ca97fd14df0c11ed7d16 not found: ID does not exist" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.329226 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data" (OuterVolumeSpecName: "config-data") pod "1f96fca1-c22d-49b4-ad56-c646c72c7807" (UID: "1f96fca1-c22d-49b4-ad56-c646c72c7807"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.407520 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f96fca1-c22d-49b4-ad56-c646c72c7807-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.526886 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.539655 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.566101 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:44 crc kubenswrapper[4772]: E0930 17:21:44.566576 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api-log" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.566597 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api-log" Sep 30 17:21:44 crc kubenswrapper[4772]: E0930 17:21:44.566619 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.566628 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.566828 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.566856 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" containerName="cinder-api-log" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.568197 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.588828 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.595551 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.597287 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.597424 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641580 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641644 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641751 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-scripts\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641776 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71eb05f1-a375-49c7-965d-ae495649ac7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641863 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mc6r\" (UniqueName: \"kubernetes.io/projected/71eb05f1-a375-49c7-965d-ae495649ac7c-kube-api-access-8mc6r\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.641933 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.642012 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/71eb05f1-a375-49c7-965d-ae495649ac7c-logs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.642042 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.743546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71eb05f1-a375-49c7-965d-ae495649ac7c-logs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.743872 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.743980 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744098 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71eb05f1-a375-49c7-965d-ae495649ac7c-logs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744218 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744338 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-scripts\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744485 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71eb05f1-a375-49c7-965d-ae495649ac7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744567 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mc6r\" (UniqueName: 
\"kubernetes.io/projected/71eb05f1-a375-49c7-965d-ae495649ac7c-kube-api-access-8mc6r\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.744907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71eb05f1-a375-49c7-965d-ae495649ac7c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.753829 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.754279 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.762775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data-custom\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.763174 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.763574 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-config-data\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.763952 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71eb05f1-a375-49c7-965d-ae495649ac7c-scripts\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.780899 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mc6r\" (UniqueName: \"kubernetes.io/projected/71eb05f1-a375-49c7-965d-ae495649ac7c-kube-api-access-8mc6r\") pod \"cinder-api-0\" (UID: \"71eb05f1-a375-49c7-965d-ae495649ac7c\") " pod="openstack/cinder-api-0" Sep 30 17:21:44 crc kubenswrapper[4772]: I0930 17:21:44.991229 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.208346 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-859bb54b8b-6n9dj" event={"ID":"190ea63c-c6c0-47e8-988c-bd89113ef485","Type":"ContainerStarted","Data":"a4624268e5036f129f8c02b20e18895e8261f555f157a574e54e49d6dc88fe26"} Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.208827 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.208845 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.250838 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-859bb54b8b-6n9dj" podStartSLOduration=3.250814817 podStartE2EDuration="3.250814817s" podCreationTimestamp="2025-09-30 17:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:45.244496894 +0000 UTC m=+1206.151509725" watchObservedRunningTime="2025-09-30 17:21:45.250814817 +0000 UTC m=+1206.157827648" Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.496981 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:21:45 crc kubenswrapper[4772]: I0930 17:21:45.911673 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f96fca1-c22d-49b4-ad56-c646c72c7807" path="/var/lib/kubelet/pods/1f96fca1-c22d-49b4-ad56-c646c72c7807/volumes" Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.221805 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"71eb05f1-a375-49c7-965d-ae495649ac7c","Type":"ContainerStarted","Data":"e87d7847bfb4fd357e8319336d0ce4be22bef7ae5ae9ae25b3c5dce84dec60bd"} Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.222147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"71eb05f1-a375-49c7-965d-ae495649ac7c","Type":"ContainerStarted","Data":"3ce893499b6fa3235793779e76659f8e77247b89d33c364a0e581c9cab0a75d1"} Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.224540 4772 generic.go:334] "Generic (PLEG): container finished" podID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" exitCode=1 Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.224814 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerDied","Data":"925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6"} Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.224857 4772 scope.go:117] "RemoveContainer" containerID="70d6c1592902b48f2ae1a10ce3c9d04cbada90b8032fa1ef3929bbc54114bf28" Sep 30 17:21:46 crc kubenswrapper[4772]: I0930 17:21:46.225290 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:21:46 crc kubenswrapper[4772]: E0930 17:21:46.225523 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine 
pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.235675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"71eb05f1-a375-49c7-965d-ae495649ac7c","Type":"ContainerStarted","Data":"74954547babf2f3e669144e99494f39fdc31da2049875746fdfd823bb6a57ac0"} Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.235922 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.262326 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.262308494 podStartE2EDuration="3.262308494s" podCreationTimestamp="2025-09-30 17:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:47.251455675 +0000 UTC m=+1208.158468496" watchObservedRunningTime="2025-09-30 17:21:47.262308494 +0000 UTC m=+1208.169321325" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.489656 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.595468 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.729386 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.777230 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.861823 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:47 crc kubenswrapper[4772]: I0930 17:21:47.862461 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="dnsmasq-dns" containerID="cri-o://dcd3f39d0c024c9e0caf4846678ea0353a772ab1a1e46917a832c3239a45ee5f" gracePeriod=10 Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.264988 4772 generic.go:334] "Generic (PLEG): container finished" podID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerID="dcd3f39d0c024c9e0caf4846678ea0353a772ab1a1e46917a832c3239a45ee5f" exitCode=0 Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.266321 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" event={"ID":"01ee93d8-ee88-45b4-b28a-9446a95be4e0","Type":"ContainerDied","Data":"dcd3f39d0c024c9e0caf4846678ea0353a772ab1a1e46917a832c3239a45ee5f"} Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.367533 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.502674 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.536010 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.536476 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.536619 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.536694 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.536931 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:21:48 crc kubenswrapper[4772]: E0930 17:21:48.537206 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.605918 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640493 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb\") pod \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640736 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc\") pod \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640848 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config\") pod \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640951 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb\") pod \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.640974 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqtv\" (UniqueName: \"kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv\") pod \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\" (UID: \"01ee93d8-ee88-45b4-b28a-9446a95be4e0\") " Sep 30 17:21:48 crc 
kubenswrapper[4772]: I0930 17:21:48.664470 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv" (OuterVolumeSpecName: "kube-api-access-9gqtv") pod "01ee93d8-ee88-45b4-b28a-9446a95be4e0" (UID: "01ee93d8-ee88-45b4-b28a-9446a95be4e0"). InnerVolumeSpecName "kube-api-access-9gqtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.725009 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01ee93d8-ee88-45b4-b28a-9446a95be4e0" (UID: "01ee93d8-ee88-45b4-b28a-9446a95be4e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.743030 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config" (OuterVolumeSpecName: "config") pod "01ee93d8-ee88-45b4-b28a-9446a95be4e0" (UID: "01ee93d8-ee88-45b4-b28a-9446a95be4e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.745679 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.745710 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.745720 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqtv\" (UniqueName: \"kubernetes.io/projected/01ee93d8-ee88-45b4-b28a-9446a95be4e0-kube-api-access-9gqtv\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.747262 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01ee93d8-ee88-45b4-b28a-9446a95be4e0" (UID: "01ee93d8-ee88-45b4-b28a-9446a95be4e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.785401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01ee93d8-ee88-45b4-b28a-9446a95be4e0" (UID: "01ee93d8-ee88-45b4-b28a-9446a95be4e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.847537 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:48 crc kubenswrapper[4772]: I0930 17:21:48.847578 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01ee93d8-ee88-45b4-b28a-9446a95be4e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.279179 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" event={"ID":"01ee93d8-ee88-45b4-b28a-9446a95be4e0","Type":"ContainerDied","Data":"9369b98fcdf6de225e6881bcaa6cb24fc091e6d1b8b607f56ed12c0441b88294"} Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.279537 4772 scope.go:117] "RemoveContainer" containerID="dcd3f39d0c024c9e0caf4846678ea0353a772ab1a1e46917a832c3239a45ee5f" Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.279930 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7974f5645c-fp6cj" Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.280667 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="cinder-scheduler" containerID="cri-o://f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007" gracePeriod=30 Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.281211 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="probe" containerID="cri-o://95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326" gracePeriod=30 Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.281612 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:21:49 crc kubenswrapper[4772]: E0930 17:21:49.281890 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.323036 4772 scope.go:117] "RemoveContainer" containerID="4baa482295d82285360960fd3490435046dd55d7ecb5ec153e282315df0b350a" Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.327233 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.334106 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7974f5645c-fp6cj"] Sep 30 17:21:49 crc kubenswrapper[4772]: I0930 17:21:49.910289 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" path="/var/lib/kubelet/pods/01ee93d8-ee88-45b4-b28a-9446a95be4e0/volumes" Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.289866 4772 generic.go:334] "Generic (PLEG): container finished" podID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" 
containerID="95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326" exitCode=0 Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.289956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerDied","Data":"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326"} Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.867935 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992404 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992475 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992573 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992624 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.992752 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle\") pod \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\" (UID: \"c18094d7-cfdf-4700-ba98-b38d1a7959e7\") " Sep 30 17:21:50 crc kubenswrapper[4772]: I0930 17:21:50.993479 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:50.998084 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:50.998470 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts" (OuterVolumeSpecName: "scripts") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:50.998796 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq" (OuterVolumeSpecName: "kube-api-access-pc2wq") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "kube-api-access-pc2wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.059320 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.079005 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data" (OuterVolumeSpecName: "config-data") pod "c18094d7-cfdf-4700-ba98-b38d1a7959e7" (UID: "c18094d7-cfdf-4700-ba98-b38d1a7959e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094556 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094589 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/c18094d7-cfdf-4700-ba98-b38d1a7959e7-kube-api-access-pc2wq\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094600 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c18094d7-cfdf-4700-ba98-b38d1a7959e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094609 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094618 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.094628 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18094d7-cfdf-4700-ba98-b38d1a7959e7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.313909 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerID="f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007" exitCode=0 Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.313956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerDied","Data":"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007"} Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.313988 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c18094d7-cfdf-4700-ba98-b38d1a7959e7","Type":"ContainerDied","Data":"444ad0c45ec066d045da14774788e2df3e3189095072553f3a0df8bd04435fca"} Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.314010 4772 scope.go:117] "RemoveContainer" containerID="95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.314023 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.344385 4772 scope.go:117] "RemoveContainer" containerID="f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.350581 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.366533 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.378813 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.379337 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="cinder-scheduler" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379355 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="cinder-scheduler" Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.379371 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="init" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379377 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="init" Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.379403 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="probe" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379409 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="probe" Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.379422 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="dnsmasq-dns" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379429 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="dnsmasq-dns" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379601 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ee93d8-ee88-45b4-b28a-9446a95be4e0" containerName="dnsmasq-dns" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379618 4772 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="cinder-scheduler" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.379630 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" containerName="probe" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.380651 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.382576 4772 scope.go:117] "RemoveContainer" containerID="95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326" Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.383562 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326\": container with ID starting with 95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326 not found: ID does not exist" containerID="95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.383597 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326"} err="failed to get container status \"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326\": rpc error: code = NotFound desc = could not find container \"95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326\": container with ID starting with 95d1c19b00473a69b2cbc73201fd0a902ea572e55950145768f89c4e80c9e326 not found: ID does not exist" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.383622 4772 scope.go:117] "RemoveContainer" containerID="f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.384084 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 17:21:51 crc kubenswrapper[4772]: E0930 17:21:51.384205 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007\": container with ID starting with f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007 not found: ID does not exist" containerID="f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.384257 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007"} err="failed to get container status \"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007\": rpc error: code = NotFound desc = could not find container \"f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007\": container with ID starting with f3442abdd84fee5244fe39a7741873338ef10320a72e4e828d9aa758ab1c8007 not found: ID does not exist" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.411127 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.502682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2e083d29-4d17-4b01-9201-dfbed0f1f304-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.502733 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-scripts\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.502795 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8sm\" (UniqueName: \"kubernetes.io/projected/2e083d29-4d17-4b01-9201-dfbed0f1f304-kube-api-access-7z8sm\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.502821 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.502912 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.503428 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605558 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605698 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e083d29-4d17-4b01-9201-dfbed0f1f304-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-scripts\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " 
pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605799 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8sm\" (UniqueName: \"kubernetes.io/projected/2e083d29-4d17-4b01-9201-dfbed0f1f304-kube-api-access-7z8sm\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.605840 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.607243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e083d29-4d17-4b01-9201-dfbed0f1f304-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.613868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.613998 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.621757 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.622157 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e083d29-4d17-4b01-9201-dfbed0f1f304-scripts\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.635480 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8sm\" (UniqueName: \"kubernetes.io/projected/2e083d29-4d17-4b01-9201-dfbed0f1f304-kube-api-access-7z8sm\") pod \"cinder-scheduler-0\" (UID: \"2e083d29-4d17-4b01-9201-dfbed0f1f304\") " pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.698224 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:21:51 crc kubenswrapper[4772]: I0930 17:21:51.911729 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18094d7-cfdf-4700-ba98-b38d1a7959e7" path="/var/lib/kubelet/pods/c18094d7-cfdf-4700-ba98-b38d1a7959e7/volumes" Sep 30 17:21:52 crc kubenswrapper[4772]: I0930 17:21:52.211409 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:21:52 crc kubenswrapper[4772]: I0930 17:21:52.336129 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e083d29-4d17-4b01-9201-dfbed0f1f304","Type":"ContainerStarted","Data":"8094d794e1bc6ab3b34704c70b6295288eb48c4211d339c4756042caffdc30e0"} Sep 30 17:21:53 crc kubenswrapper[4772]: I0930 17:21:53.358771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e083d29-4d17-4b01-9201-dfbed0f1f304","Type":"ContainerStarted","Data":"f2389f41c1308449b15a7c71951cffced56c395f3338916e2a3feeb8e4b409d6"} Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.158520 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.281692 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.388910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e083d29-4d17-4b01-9201-dfbed0f1f304","Type":"ContainerStarted","Data":"ac7bf8a92a09df34546d9fc40c170aaa84d0a8245e498e00a6a1f2b19e0260a4"} Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.440530 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.440505531 podStartE2EDuration="3.440505531s" podCreationTimestamp="2025-09-30 17:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:21:54.435275596 +0000 UTC m=+1215.342288427" watchObservedRunningTime="2025-09-30 17:21:54.440505531 +0000 UTC m=+1215.347518362" Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.689145 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-859bb54b8b-6n9dj" Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.767497 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.774415 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" containerID="cri-o://36dd36f7c3a99113ebccb8c28277c3c72ac46a3b3ac586ed8d8f6de24520c99d" gracePeriod=30 Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.774910 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" containerID="cri-o://9f90a0c7c82f6de94a3af86271fbe8a8a943b8d3bdd2a97f97252f1ba531f302" gracePeriod=30 Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.795886 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-56f46f54c4-2r9ck" 
podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF" Sep 30 17:21:54 crc kubenswrapper[4772]: I0930 17:21:54.796225 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF" Sep 30 17:21:55 crc kubenswrapper[4772]: I0930 17:21:55.415481 4772 generic.go:334] "Generic (PLEG): container finished" podID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerID="36dd36f7c3a99113ebccb8c28277c3c72ac46a3b3ac586ed8d8f6de24520c99d" exitCode=143 Sep 30 17:21:55 crc kubenswrapper[4772]: I0930 17:21:55.416804 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerDied","Data":"36dd36f7c3a99113ebccb8c28277c3c72ac46a3b3ac586ed8d8f6de24520c99d"} Sep 30 17:21:56 crc kubenswrapper[4772]: I0930 17:21:56.698467 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.122897 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-684cbd44c-xstzf" Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.188472 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.188818 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5968d57d6b-kr75b" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-api" containerID="cri-o://52e6b764f4ed2804f0973e9b20adaf67a7c262b62645628878c16351d6c49f78" gracePeriod=30 Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.188916 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5968d57d6b-kr75b" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-httpd" containerID="cri-o://157aa95c6a326da39e72adbee81d1195c38929f51a9219408226f55f5ce5efa0" gracePeriod=30 Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.466625 4772 generic.go:334] "Generic (PLEG): container finished" podID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerID="157aa95c6a326da39e72adbee81d1195c38929f51a9219408226f55f5ce5efa0" exitCode=0 Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.466802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerDied","Data":"157aa95c6a326da39e72adbee81d1195c38929f51a9219408226f55f5ce5efa0"} Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.946735 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:39464->10.217.0.167:9311: read: connection reset by peer" Sep 30 17:21:57 crc kubenswrapper[4772]: I0930 17:21:57.947280 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56f46f54c4-2r9ck" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": 
read tcp 10.217.0.2:39460->10.217.0.167:9311: read: connection reset by peer" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.491630 4772 generic.go:334] "Generic (PLEG): container finished" podID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerID="9f90a0c7c82f6de94a3af86271fbe8a8a943b8d3bdd2a97f97252f1ba531f302" exitCode=0 Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.492149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerDied","Data":"9f90a0c7c82f6de94a3af86271fbe8a8a943b8d3bdd2a97f97252f1ba531f302"} Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.603920 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76ff4c9cf5-7gpvg" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.636621 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.798149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data\") pod \"22b34d70-b7d3-4191-b033-2d2f50b324a6\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.798232 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle\") pod \"22b34d70-b7d3-4191-b033-2d2f50b324a6\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.798342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs\") pod \"22b34d70-b7d3-4191-b033-2d2f50b324a6\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.798432 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ddsv\" (UniqueName: \"kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv\") pod \"22b34d70-b7d3-4191-b033-2d2f50b324a6\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.798464 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom\") pod \"22b34d70-b7d3-4191-b033-2d2f50b324a6\" (UID: \"22b34d70-b7d3-4191-b033-2d2f50b324a6\") " Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.808499 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs" (OuterVolumeSpecName: "logs") pod "22b34d70-b7d3-4191-b033-2d2f50b324a6" (UID: "22b34d70-b7d3-4191-b033-2d2f50b324a6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.814446 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv" (OuterVolumeSpecName: "kube-api-access-2ddsv") pod "22b34d70-b7d3-4191-b033-2d2f50b324a6" (UID: "22b34d70-b7d3-4191-b033-2d2f50b324a6"). InnerVolumeSpecName "kube-api-access-2ddsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.817858 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22b34d70-b7d3-4191-b033-2d2f50b324a6" (UID: "22b34d70-b7d3-4191-b033-2d2f50b324a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.854650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22b34d70-b7d3-4191-b033-2d2f50b324a6" (UID: "22b34d70-b7d3-4191-b033-2d2f50b324a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.878151 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data" (OuterVolumeSpecName: "config-data") pod "22b34d70-b7d3-4191-b033-2d2f50b324a6" (UID: "22b34d70-b7d3-4191-b033-2d2f50b324a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.905567 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.905606 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.905618 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b34d70-b7d3-4191-b033-2d2f50b324a6-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.905627 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ddsv\" (UniqueName: \"kubernetes.io/projected/22b34d70-b7d3-4191-b033-2d2f50b324a6-kube-api-access-2ddsv\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.905638 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22b34d70-b7d3-4191-b033-2d2f50b324a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:58 crc kubenswrapper[4772]: I0930 17:21:58.984227 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.504372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f46f54c4-2r9ck" event={"ID":"22b34d70-b7d3-4191-b033-2d2f50b324a6","Type":"ContainerDied","Data":"317d5c5351813c59d9f42830ad9aedf140483ec47f7d3155e3a79cfd89b1fc70"} Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.504423 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56f46f54c4-2r9ck" Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.504429 4772 scope.go:117] "RemoveContainer" containerID="9f90a0c7c82f6de94a3af86271fbe8a8a943b8d3bdd2a97f97252f1ba531f302" Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.537408 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.548852 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56f46f54c4-2r9ck"] Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.550184 4772 scope.go:117] "RemoveContainer" containerID="36dd36f7c3a99113ebccb8c28277c3c72ac46a3b3ac586ed8d8f6de24520c99d" Sep 30 17:21:59 crc kubenswrapper[4772]: I0930 17:21:59.911444 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" path="/var/lib/kubelet/pods/22b34d70-b7d3-4191-b033-2d2f50b324a6/volumes" Sep 30 17:22:00 crc kubenswrapper[4772]: I0930 17:22:00.898581 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:22:00 crc kubenswrapper[4772]: E0930 17:22:00.898886 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:22:01 crc kubenswrapper[4772]: I0930 17:22:01.918832 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.130527 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 17:22:02 crc kubenswrapper[4772]: E0930 17:22:02.130924 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.130943 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" Sep 30 17:22:02 crc kubenswrapper[4772]: E0930 17:22:02.130985 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.130993 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.131545 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.131565 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b34d70-b7d3-4191-b033-2d2f50b324a6" containerName="barbican-api-log" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.132204 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.140605 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.140895 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.141028 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9hr9s" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.150125 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.292049 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.292268 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config-secret\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.292311 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rg7\" (UniqueName: \"kubernetes.io/projected/45aef289-46c6-4393-9032-2fe923b5948a-kube-api-access-j8rg7\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.292441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.394405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.394806 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.396371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.397045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config-secret\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.397104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rg7\" (UniqueName: \"kubernetes.io/projected/45aef289-46c6-4393-9032-2fe923b5948a-kube-api-access-j8rg7\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.403836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-openstack-config-secret\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.403955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45aef289-46c6-4393-9032-2fe923b5948a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.416316 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rg7\" (UniqueName: \"kubernetes.io/projected/45aef289-46c6-4393-9032-2fe923b5948a-kube-api-access-j8rg7\") pod \"openstackclient\" (UID: \"45aef289-46c6-4393-9032-2fe923b5948a\") " pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.482224 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.570117 4772 generic.go:334] "Generic (PLEG): container finished" podID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerID="52e6b764f4ed2804f0973e9b20adaf67a7c262b62645628878c16351d6c49f78" exitCode=0 Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.570299 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerDied","Data":"52e6b764f4ed2804f0973e9b20adaf67a7c262b62645628878c16351d6c49f78"} Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.671940 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.702412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config\") pod \"3b66dba4-4b8a-4340-97d1-f6c995748763\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.702450 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle\") pod \"3b66dba4-4b8a-4340-97d1-f6c995748763\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.702545 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs\") pod \"3b66dba4-4b8a-4340-97d1-f6c995748763\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.702573 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znknl\" (UniqueName: \"kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl\") pod \"3b66dba4-4b8a-4340-97d1-f6c995748763\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.702621 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config\") pod \"3b66dba4-4b8a-4340-97d1-f6c995748763\" (UID: \"3b66dba4-4b8a-4340-97d1-f6c995748763\") " Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.713401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl" (OuterVolumeSpecName: "kube-api-access-znknl") pod "3b66dba4-4b8a-4340-97d1-f6c995748763" (UID: "3b66dba4-4b8a-4340-97d1-f6c995748763"). InnerVolumeSpecName "kube-api-access-znknl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.717244 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3b66dba4-4b8a-4340-97d1-f6c995748763" (UID: "3b66dba4-4b8a-4340-97d1-f6c995748763"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.765555 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config" (OuterVolumeSpecName: "config") pod "3b66dba4-4b8a-4340-97d1-f6c995748763" (UID: "3b66dba4-4b8a-4340-97d1-f6c995748763"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.782236 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b66dba4-4b8a-4340-97d1-f6c995748763" (UID: "3b66dba4-4b8a-4340-97d1-f6c995748763"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.805085 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.805140 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.805176 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.805196 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znknl\" (UniqueName: \"kubernetes.io/projected/3b66dba4-4b8a-4340-97d1-f6c995748763-kube-api-access-znknl\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.805081 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3b66dba4-4b8a-4340-97d1-f6c995748763" (UID: "3b66dba4-4b8a-4340-97d1-f6c995748763"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:02 crc kubenswrapper[4772]: I0930 17:22:02.906387 4772 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b66dba4-4b8a-4340-97d1-f6c995748763-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.012069 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.590441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5968d57d6b-kr75b" event={"ID":"3b66dba4-4b8a-4340-97d1-f6c995748763","Type":"ContainerDied","Data":"4fa4ec9133bcf60c195f54bfff00ccce3018497f59fcd905fbccde491112e8bb"} Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.590499 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5968d57d6b-kr75b" Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.590774 4772 scope.go:117] "RemoveContainer" containerID="157aa95c6a326da39e72adbee81d1195c38929f51a9219408226f55f5ce5efa0" Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.593558 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"45aef289-46c6-4393-9032-2fe923b5948a","Type":"ContainerStarted","Data":"48379bbcf44bdd4b25e56b2562ebc7789ef5b1dfd87e6b9f5e9b45cadd70c07e"} Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.616217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.633071 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.635210 4772 scope.go:117] "RemoveContainer" containerID="52e6b764f4ed2804f0973e9b20adaf67a7c262b62645628878c16351d6c49f78" Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.644674 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5968d57d6b-kr75b"] Sep 30 17:22:03 crc kubenswrapper[4772]: I0930 17:22:03.911169 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" path="/var/lib/kubelet/pods/3b66dba4-4b8a-4340-97d1-f6c995748763/volumes" Sep 30 17:22:09 crc kubenswrapper[4772]: I0930 17:22:09.331665 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:09 crc kubenswrapper[4772]: I0930 17:22:09.333037 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerName="kube-state-metrics" containerID="cri-o://c3a6cb858479d78e84807d01e01b63af66640a5000e12b8f2245386e6d6b788b" gracePeriod=30 Sep 30 17:22:09 crc kubenswrapper[4772]: I0930 17:22:09.662003 4772 generic.go:334] "Generic (PLEG): container finished" podID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerID="c3a6cb858479d78e84807d01e01b63af66640a5000e12b8f2245386e6d6b788b" exitCode=2 Sep 30 17:22:09 crc kubenswrapper[4772]: I0930 17:22:09.662044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7de151f-3a4a-46c0-ae33-74cb5da8b13a","Type":"ContainerDied","Data":"c3a6cb858479d78e84807d01e01b63af66640a5000e12b8f2245386e6d6b788b"} Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.351749 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.352691 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-central-agent" containerID="cri-o://17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469" gracePeriod=30 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.352719 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="proxy-httpd" containerID="cri-o://96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d" gracePeriod=30 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.353143 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="sg-core" containerID="cri-o://9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42" gracePeriod=30 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.353203 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-notification-agent" containerID="cri-o://8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6" gracePeriod=30 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.691372 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerID="96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d" exitCode=0 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.691469 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerID="9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42" exitCode=2 Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.691507 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerDied","Data":"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d"} Sep 30 17:22:10 crc kubenswrapper[4772]: I0930 17:22:10.691545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerDied","Data":"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42"} Sep 30 17:22:11 crc kubenswrapper[4772]: I0930 17:22:11.705560 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerID="17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469" exitCode=0 Sep 30 17:22:11 crc kubenswrapper[4772]: I0930 17:22:11.705608 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerDied","Data":"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469"} Sep 30 17:22:12 crc kubenswrapper[4772]: I0930 17:22:12.765298 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused" Sep 30 17:22:13 crc kubenswrapper[4772]: I0930 17:22:13.899472 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.112118 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.220664 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxbw\" (UniqueName: \"kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw\") pod \"f7de151f-3a4a-46c0-ae33-74cb5da8b13a\" (UID: \"f7de151f-3a4a-46c0-ae33-74cb5da8b13a\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.225102 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw" (OuterVolumeSpecName: "kube-api-access-hgxbw") pod "f7de151f-3a4a-46c0-ae33-74cb5da8b13a" (UID: "f7de151f-3a4a-46c0-ae33-74cb5da8b13a"). InnerVolumeSpecName "kube-api-access-hgxbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.325821 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxbw\" (UniqueName: \"kubernetes.io/projected/f7de151f-3a4a-46c0-ae33-74cb5da8b13a-kube-api-access-hgxbw\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.498707 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632561 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632682 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632722 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632748 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v4sk\" (UniqueName: \"kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.632841 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.633032 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd\") pod \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\" (UID: \"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb\") " Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.633726 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.633795 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.640236 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts" (OuterVolumeSpecName: "scripts") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.650419 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk" (OuterVolumeSpecName: "kube-api-access-7v4sk") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "kube-api-access-7v4sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.687138 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.735381 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.735413 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.735443 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.735459 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v4sk\" (UniqueName: \"kubernetes.io/projected/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-kube-api-access-7v4sk\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.735469 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.738918 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.741826 4772 generic.go:334] "Generic (PLEG): container finished" podID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerID="8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6" exitCode=0 Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.741937 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.741899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerDied","Data":"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6"} Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.742235 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb","Type":"ContainerDied","Data":"b247136d4ed465ef0c28d9bfc61fdfa608785fcb5d4bf552e3d711d30708aa03"} Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.742262 4772 scope.go:117] "RemoveContainer" containerID="96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.745010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"45aef289-46c6-4393-9032-2fe923b5948a","Type":"ContainerStarted","Data":"ab5c3637f60713ea1c1f8de8c8eddda07dd7ca96d2435dbfff796fe8a80ab102"} Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.747703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58"} Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.750691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7de151f-3a4a-46c0-ae33-74cb5da8b13a","Type":"ContainerDied","Data":"a83d8c9bcfd68fa37bb64636e67bd35da6baf110424298177271fa142918bfb5"} Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.750737 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.774373 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.63198405 podStartE2EDuration="12.774341591s" podCreationTimestamp="2025-09-30 17:22:02 +0000 UTC" firstStartedPulling="2025-09-30 17:22:03.01726152 +0000 UTC m=+1223.924274351" lastFinishedPulling="2025-09-30 17:22:14.159619061 +0000 UTC m=+1235.066631892" observedRunningTime="2025-09-30 17:22:14.760224106 +0000 UTC m=+1235.667236957" watchObservedRunningTime="2025-09-30 17:22:14.774341591 +0000 UTC m=+1235.681354422" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.776813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data" (OuterVolumeSpecName: "config-data") pod "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" (UID: "8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.799200 4772 scope.go:117] "RemoveContainer" containerID="9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.823973 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.825084 4772 scope.go:117] "RemoveContainer" containerID="8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.853938 4772 scope.go:117] "RemoveContainer" containerID="17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.855355 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.855442 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.860047 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.871615 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872212 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerName="kube-state-metrics" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872235 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerName="kube-state-metrics" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872244 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-central-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872251 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-central-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872271 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-api" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872281 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-api" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872287 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872293 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872308 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-notification-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872314 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-notification-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872341 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="sg-core" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872348 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="sg-core" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.872366 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="proxy-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872373 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="proxy-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872603 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="sg-core" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872619 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" containerName="kube-state-metrics" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872634 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-notification-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872646 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="proxy-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872659 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-httpd" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872674 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" containerName="ceilometer-central-agent" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.872686 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b66dba4-4b8a-4340-97d1-f6c995748763" containerName="neutron-api" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.873704 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.875669 4772 scope.go:117] "RemoveContainer" containerID="96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.876652 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d\": container with ID starting with 96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d not found: ID does not exist" containerID="96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.876710 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.876704 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d"} err="failed to get container status \"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d\": rpc error: code = NotFound desc = could not find container \"96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d\": container with ID starting with 96ecd76d1daa0b170b7381e11a363dedc5b6f95835b6238fc79dee7c75b4a18d not found: ID does not exist" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.876754 4772 scope.go:117] "RemoveContainer" containerID="9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.876978 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.877177 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42\": container with ID starting with 9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42 not found: ID does not exist" containerID="9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.877214 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42"} err="failed to get container status \"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42\": rpc error: code = NotFound desc = could not find container \"9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42\": container with ID starting with 9e110139766f3aedcd2abea3f1777c133e00a0530d351106a5a8a88086896e42 not found: ID does not exist" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.877270 4772 scope.go:117] "RemoveContainer" containerID="8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.878484 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6\": container with ID starting with 8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6 not found: ID does not exist" containerID="8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6" Sep 30 17:22:14 crc 
kubenswrapper[4772]: I0930 17:22:14.878511 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6"} err="failed to get container status \"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6\": rpc error: code = NotFound desc = could not find container \"8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6\": container with ID starting with 8fa8a9243f4f31d1937bc4c1099b8d833eb5b403d00750d29fc5c584616821c6 not found: ID does not exist" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.878529 4772 scope.go:117] "RemoveContainer" containerID="17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469" Sep 30 17:22:14 crc kubenswrapper[4772]: E0930 17:22:14.878784 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469\": container with ID starting with 17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469 not found: ID does not exist" containerID="17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.878807 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469"} err="failed to get container status \"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469\": rpc error: code = NotFound desc = could not find container \"17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469\": container with ID starting with 17e88b51f505bb57c35e4253450243c47e14cba26dec94e087a67f52612c0469 not found: ID does not exist" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.878821 4772 scope.go:117] "RemoveContainer" containerID="c3a6cb858479d78e84807d01e01b63af66640a5000e12b8f2245386e6d6b788b" Sep 30 17:22:14 crc kubenswrapper[4772]: I0930 17:22:14.882524 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.059684 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6465x\" (UniqueName: \"kubernetes.io/projected/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-api-access-6465x\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.059842 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.060812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.060971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.079190 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.090969 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.098370 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.101098 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.103980 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.104016 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.104312 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.116774 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.166476 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.166580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6465x\" (UniqueName: \"kubernetes.io/projected/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-api-access-6465x\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.166639 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.166663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.173391 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.194743 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.197760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.198194 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6465x\" (UniqueName: \"kubernetes.io/projected/ed142cd3-4d43-4293-af1f-d2a76649b5a2-kube-api-access-6465x\") pod \"kube-state-metrics-0\" (UID: \"ed142cd3-4d43-4293-af1f-d2a76649b5a2\") " pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272249 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272297 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272392 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272426 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272471 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.272943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6299\" (UniqueName: \"kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375278 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375355 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375424 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375498 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375538 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375592 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.375696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6299\" (UniqueName: \"kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.376505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.376875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.380838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.391479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.392095 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.393166 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.395189 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.397099 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6299\" (UniqueName: \"kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299\") pod \"ceilometer-0\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") " pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.496925 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.507251 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.508177 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.920870 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb" path="/var/lib/kubelet/pods/8f38b557-b4c1-40f4-8a4c-aa1dde4b79bb/volumes" Sep 30 17:22:15 crc kubenswrapper[4772]: I0930 17:22:15.922600 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7de151f-3a4a-46c0-ae33-74cb5da8b13a" path="/var/lib/kubelet/pods/f7de151f-3a4a-46c0-ae33-74cb5da8b13a/volumes" Sep 30 17:22:16 crc kubenswrapper[4772]: I0930 17:22:16.003001 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:16 crc kubenswrapper[4772]: W0930 17:22:16.016821 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2570b35a_4d00_46e8_aca6_e8fa15d6347c.slice/crio-368b0ae298f9cfe60257b0cc88113218ebb40381c06b42ade1397cd046675f2e WatchSource:0}: Error finding container 368b0ae298f9cfe60257b0cc88113218ebb40381c06b42ade1397cd046675f2e: Status 404 returned error can't find the container with id 368b0ae298f9cfe60257b0cc88113218ebb40381c06b42ade1397cd046675f2e Sep 30 17:22:16 crc kubenswrapper[4772]: I0930 17:22:16.281151 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:22:16 crc kubenswrapper[4772]: I0930 17:22:16.798229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerStarted","Data":"368b0ae298f9cfe60257b0cc88113218ebb40381c06b42ade1397cd046675f2e"} Sep 30 17:22:16 crc kubenswrapper[4772]: I0930 17:22:16.800160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed142cd3-4d43-4293-af1f-d2a76649b5a2","Type":"ContainerStarted","Data":"41a4326cacc5ad33c6e69f00e0d900891a0a2828a87fd92eab60efb72d5cd285"} Sep 30 17:22:17 crc kubenswrapper[4772]: I0930 17:22:17.818406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerStarted","Data":"727ff0e7a40c42d7caa98ada062b906e9238959f1937467fb3f913fb0406c5ff"} Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.534379 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.534787 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 17:22:18 crc kubenswrapper[4772]: E0930 17:22:18.534716 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58 is running failed: container process not found" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:22:18 crc kubenswrapper[4772]: E0930 17:22:18.535217 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58 is running failed: container process not found" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" 
cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:22:18 crc kubenswrapper[4772]: E0930 17:22:18.535471 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58 is running failed: container process not found" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Sep 30 17:22:18 crc kubenswrapper[4772]: E0930 17:22:18.535499 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerName="watcher-decision-engine" Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.832342 4772 generic.go:334] "Generic (PLEG): container finished" podID="69f02322-0ff1-410e-8b46-dd3b5f909963" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" exitCode=1 Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.832377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerDied","Data":"68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58"} Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.832433 4772 scope.go:117] "RemoveContainer" containerID="925c4324dbfd6497d97e3bf754fbbe81ec7472b79ed9fa25fd5fffb0aaf527a6" Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.832994 4772 scope.go:117] "RemoveContainer" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" Sep 30 17:22:18 crc kubenswrapper[4772]: E0930 17:22:18.833238 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.835365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed142cd3-4d43-4293-af1f-d2a76649b5a2","Type":"ContainerStarted","Data":"0cc47f3e787195306799b6e889d9a7efdc93a5901ffd0640fe712931e6a3849b"} Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.835702 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 17:22:18 crc kubenswrapper[4772]: I0930 17:22:18.872015 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.366691084 podStartE2EDuration="4.871997356s" podCreationTimestamp="2025-09-30 17:22:14 +0000 UTC" firstStartedPulling="2025-09-30 17:22:16.293285386 +0000 UTC m=+1237.200298217" lastFinishedPulling="2025-09-30 17:22:17.798591658 +0000 UTC m=+1238.705604489" observedRunningTime="2025-09-30 17:22:18.869692016 +0000 UTC m=+1239.776704847" watchObservedRunningTime="2025-09-30 17:22:18.871997356 +0000 UTC m=+1239.779010177" Sep 30 17:22:19 crc kubenswrapper[4772]: I0930 17:22:19.863144 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerStarted","Data":"02af6e597b96c7aec152b0f6b191648205758a361c7f1d4998973276fde7f444"} Sep 30 17:22:19 crc kubenswrapper[4772]: I0930 17:22:19.864118 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerStarted","Data":"8b4bdc0a3604aeed6b4865bd60c61463f7eefe95f6dd5055417c0f50438ea2b8"} Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.884847 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerStarted","Data":"d64905c89e0146a96fb885a21521a136f73fa6b45c03c0941681e78a46bd92cd"} Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.885418 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.884996 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-central-agent" containerID="cri-o://727ff0e7a40c42d7caa98ada062b906e9238959f1937467fb3f913fb0406c5ff" gracePeriod=30 Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.885092 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="proxy-httpd" containerID="cri-o://d64905c89e0146a96fb885a21521a136f73fa6b45c03c0941681e78a46bd92cd" gracePeriod=30 Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.885093 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-notification-agent" containerID="cri-o://8b4bdc0a3604aeed6b4865bd60c61463f7eefe95f6dd5055417c0f50438ea2b8" gracePeriod=30 Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.885050 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="sg-core" containerID="cri-o://02af6e597b96c7aec152b0f6b191648205758a361c7f1d4998973276fde7f444" gracePeriod=30 Sep 30 17:22:21 crc kubenswrapper[4772]: I0930 17:22:21.908690 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.991872054 podStartE2EDuration="6.908672254s" podCreationTimestamp="2025-09-30 17:22:15 +0000 UTC" firstStartedPulling="2025-09-30 17:22:16.019699693 +0000 UTC m=+1236.926712534" lastFinishedPulling="2025-09-30 17:22:20.936499903 +0000 UTC m=+1241.843512734" observedRunningTime="2025-09-30 17:22:21.90772049 +0000 UTC m=+1242.814733351" watchObservedRunningTime="2025-09-30 17:22:21.908672254 +0000 UTC m=+1242.815685085" Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896608 4772 generic.go:334] "Generic (PLEG): container finished" podID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerID="d64905c89e0146a96fb885a21521a136f73fa6b45c03c0941681e78a46bd92cd" exitCode=0 Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896865 4772 generic.go:334] "Generic (PLEG): container finished" podID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerID="02af6e597b96c7aec152b0f6b191648205758a361c7f1d4998973276fde7f444" exitCode=2 Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896874 4772 
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896874 4772 generic.go:334] "Generic (PLEG): container finished" podID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerID="8b4bdc0a3604aeed6b4865bd60c61463f7eefe95f6dd5055417c0f50438ea2b8" exitCode=0
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896881 4772 generic.go:334] "Generic (PLEG): container finished" podID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerID="727ff0e7a40c42d7caa98ada062b906e9238959f1937467fb3f913fb0406c5ff" exitCode=0
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerDied","Data":"d64905c89e0146a96fb885a21521a136f73fa6b45c03c0941681e78a46bd92cd"}
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896915 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerDied","Data":"02af6e597b96c7aec152b0f6b191648205758a361c7f1d4998973276fde7f444"}
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896928 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerDied","Data":"8b4bdc0a3604aeed6b4865bd60c61463f7eefe95f6dd5055417c0f50438ea2b8"}
Sep 30 17:22:22 crc kubenswrapper[4772]: I0930 17:22:22.896939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerDied","Data":"727ff0e7a40c42d7caa98ada062b906e9238959f1937467fb3f913fb0406c5ff"}
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.337277 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.466966 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467174 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467246 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6299\" (UniqueName: \"kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467337 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467410 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467446 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467474 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.467512 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.468047 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.468501 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.474118 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299" (OuterVolumeSpecName: "kube-api-access-j6299") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "kube-api-access-j6299". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.481860 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts" (OuterVolumeSpecName: "scripts") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.496354 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.516423 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.547956 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.568187 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data" (OuterVolumeSpecName: "config-data") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.568693 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") pod \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\" (UID: \"2570b35a-4d00-46e8-aca6-e8fa15d6347c\") "
Sep 30 17:22:23 crc kubenswrapper[4772]: W0930 17:22:23.568851 4772 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2570b35a-4d00-46e8-aca6-e8fa15d6347c/volumes/kubernetes.io~secret/config-data
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.568888 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data" (OuterVolumeSpecName: "config-data") pod "2570b35a-4d00-46e8-aca6-e8fa15d6347c" (UID: "2570b35a-4d00-46e8-aca6-e8fa15d6347c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569325 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6299\" (UniqueName: \"kubernetes.io/projected/2570b35a-4d00-46e8-aca6-e8fa15d6347c-kube-api-access-j6299\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569349 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569362 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569371 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2570b35a-4d00-46e8-aca6-e8fa15d6347c-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569380 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569390 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569400 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.569408 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2570b35a-4d00-46e8-aca6-e8fa15d6347c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.907586 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.909990 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2570b35a-4d00-46e8-aca6-e8fa15d6347c","Type":"ContainerDied","Data":"368b0ae298f9cfe60257b0cc88113218ebb40381c06b42ade1397cd046675f2e"}
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.910031 4772 scope.go:117] "RemoveContainer" containerID="d64905c89e0146a96fb885a21521a136f73fa6b45c03c0941681e78a46bd92cd"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.943195 4772 scope.go:117] "RemoveContainer" containerID="02af6e597b96c7aec152b0f6b191648205758a361c7f1d4998973276fde7f444"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.961461 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.974141 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.991408 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:23 crc kubenswrapper[4772]: E0930 17:22:23.991973 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="proxy-httpd"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.991996 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="proxy-httpd"
Sep 30 17:22:23 crc kubenswrapper[4772]: E0930 17:22:23.992008 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-notification-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992016 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-notification-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: E0930 17:22:23.992038 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-central-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992046 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-central-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: E0930 17:22:23.992077 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="sg-core"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992086 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="sg-core"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992298 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-notification-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992319 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="sg-core"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992355 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="proxy-httpd"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.992366 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" containerName="ceilometer-central-agent"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.994387 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.998745 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.999040 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Sep 30 17:22:23 crc kubenswrapper[4772]: I0930 17:22:23.999235 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.000721 4772 scope.go:117] "RemoveContainer" containerID="8b4bdc0a3604aeed6b4865bd60c61463f7eefe95f6dd5055417c0f50438ea2b8"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.003034 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.029085 4772 scope.go:117] "RemoveContainer" containerID="727ff0e7a40c42d7caa98ada062b906e9238959f1937467fb3f913fb0406c5ff"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179171 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179257 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrv2\" (UniqueName: \"kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179282 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179315 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179334 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.179441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280465 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280573 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280619 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280774 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280818 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrv2\" (UniqueName: \"kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.280836 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.282337 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.282383 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.286087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.286178 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.286481 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.287023 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.296504 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.298434 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrv2\" (UniqueName: \"kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2\") pod \"ceilometer-0\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.326474 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.769586 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:24 crc kubenswrapper[4772]: W0930 17:22:24.780259 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a276efa_f7fa_4a91_8489_4be031bd1326.slice/crio-ba5a9195c19af6353be6dcff0fe012403fe22d9718323c86d0108b661b1dfcf8 WatchSource:0}: Error finding container ba5a9195c19af6353be6dcff0fe012403fe22d9718323c86d0108b661b1dfcf8: Status 404 returned error can't find the container with id ba5a9195c19af6353be6dcff0fe012403fe22d9718323c86d0108b661b1dfcf8
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.782704 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:22:24 crc kubenswrapper[4772]: I0930 17:22:24.922483 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerStarted","Data":"ba5a9195c19af6353be6dcff0fe012403fe22d9718323c86d0108b661b1dfcf8"}
Sep 30 17:22:25 crc kubenswrapper[4772]: I0930 17:22:25.522303 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Sep 30 17:22:25 crc kubenswrapper[4772]: I0930 17:22:25.908825 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2570b35a-4d00-46e8-aca6-e8fa15d6347c" path="/var/lib/kubelet/pods/2570b35a-4d00-46e8-aca6-e8fa15d6347c/volumes"
Sep 30 17:22:25 crc kubenswrapper[4772]: I0930 17:22:25.938996 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerStarted","Data":"eac16b1dadeea7ee32522305d36abca308824945f74bec751a6fa34dfb7866c3"}
Sep 30 17:22:25 crc kubenswrapper[4772]: I0930 17:22:25.939042 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerStarted","Data":"04b440239877ccf50ec895dcdb517c6a763cf5a4e472d33b47619717176d79fc"}
Sep 30 17:22:26 crc kubenswrapper[4772]: I0930 17:22:26.949681 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerStarted","Data":"0cc28e31a723e360062dd6334b36e4624fd92409764d3b236995de8df5bbc0e7"}
Sep 30 17:22:27 crc kubenswrapper[4772]: I0930 17:22:27.962450 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerStarted","Data":"d0c7ec5448909d995c08305d3d28bd109a828b4b0c22cf3b35d8bac72d7da019"}
Sep 30 17:22:27 crc kubenswrapper[4772]: I0930 17:22:27.962915 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 17:22:27 crc kubenswrapper[4772]: I0930 17:22:27.989417 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.513230172 podStartE2EDuration="4.989402551s" podCreationTimestamp="2025-09-30 17:22:23 +0000 UTC" firstStartedPulling="2025-09-30 17:22:24.782436551 +0000 UTC m=+1245.689449382" lastFinishedPulling="2025-09-30 17:22:27.25860894 +0000 UTC m=+1248.165621761" observedRunningTime="2025-09-30 17:22:27.987617055 +0000 UTC m=+1248.894629886" watchObservedRunningTime="2025-09-30 17:22:27.989402551 +0000 UTC m=+1248.896415382"
Sep 30 17:22:28 crc kubenswrapper[4772]: I0930 17:22:28.173089 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 17:22:28 crc kubenswrapper[4772]: I0930 17:22:28.533129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 17:22:28 crc kubenswrapper[4772]: I0930 17:22:28.534312 4772 scope.go:117] "RemoveContainer" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58"
Sep 30 17:22:28 crc kubenswrapper[4772]: E0930 17:22:28.534581 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963"
Sep 30 17:22:29 crc kubenswrapper[4772]: I0930 17:22:29.980615 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="proxy-httpd" containerID="cri-o://d0c7ec5448909d995c08305d3d28bd109a828b4b0c22cf3b35d8bac72d7da019" gracePeriod=30
Sep 30 17:22:29 crc kubenswrapper[4772]: I0930 17:22:29.980631 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="sg-core" containerID="cri-o://0cc28e31a723e360062dd6334b36e4624fd92409764d3b236995de8df5bbc0e7" gracePeriod=30
Sep 30 17:22:29 crc kubenswrapper[4772]: I0930 17:22:29.981046 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-central-agent" containerID="cri-o://04b440239877ccf50ec895dcdb517c6a763cf5a4e472d33b47619717176d79fc" gracePeriod=30
Sep 30 17:22:29 crc kubenswrapper[4772]: I0930 17:22:29.980653 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-notification-agent" containerID="cri-o://eac16b1dadeea7ee32522305d36abca308824945f74bec751a6fa34dfb7866c3" gracePeriod=30
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.995541 4772 generic.go:334] "Generic (PLEG): container finished" podID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerID="d0c7ec5448909d995c08305d3d28bd109a828b4b0c22cf3b35d8bac72d7da019" exitCode=0
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.996004 4772 generic.go:334] "Generic (PLEG): container finished" podID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerID="0cc28e31a723e360062dd6334b36e4624fd92409764d3b236995de8df5bbc0e7" exitCode=2
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.996018 4772 generic.go:334] "Generic (PLEG): container finished" podID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerID="eac16b1dadeea7ee32522305d36abca308824945f74bec751a6fa34dfb7866c3" exitCode=0
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.995610 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerDied","Data":"d0c7ec5448909d995c08305d3d28bd109a828b4b0c22cf3b35d8bac72d7da019"}
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.996089 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerDied","Data":"0cc28e31a723e360062dd6334b36e4624fd92409764d3b236995de8df5bbc0e7"}
Sep 30 17:22:30 crc kubenswrapper[4772]: I0930 17:22:30.996109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerDied","Data":"eac16b1dadeea7ee32522305d36abca308824945f74bec751a6fa34dfb7866c3"}
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.020786 4772 generic.go:334] "Generic (PLEG): container finished" podID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerID="04b440239877ccf50ec895dcdb517c6a763cf5a4e472d33b47619717176d79fc" exitCode=0
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.020952 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerDied","Data":"04b440239877ccf50ec895dcdb517c6a763cf5a4e472d33b47619717176d79fc"}
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.584722 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683280 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683354 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-sg-core-conf-yaml\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683380 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683492 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683533 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrv2\" (UniqueName: \"kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.683622 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.684048 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.684669 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.684730 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") "
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.685325 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.685350 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a276efa-f7fa-4a91-8489-4be031bd1326-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.693606 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts" (OuterVolumeSpecName: "scripts") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.693658 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2" (OuterVolumeSpecName: "kube-api-access-lbrv2") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "kube-api-access-lbrv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.747353 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.785857 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786210 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") pod \"4a276efa-f7fa-4a91-8489-4be031bd1326\" (UID: \"4a276efa-f7fa-4a91-8489-4be031bd1326\") " Sep 30 17:22:33 crc kubenswrapper[4772]: W0930 17:22:33.786352 4772 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4a276efa-f7fa-4a91-8489-4be031bd1326/volumes/kubernetes.io~secret/combined-ca-bundle Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786683 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786710 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrv2\" (UniqueName: \"kubernetes.io/projected/4a276efa-f7fa-4a91-8489-4be031bd1326-kube-api-access-lbrv2\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786724 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786736 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.786746 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.812955 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data" (OuterVolumeSpecName: "config-data") pod "4a276efa-f7fa-4a91-8489-4be031bd1326" (UID: "4a276efa-f7fa-4a91-8489-4be031bd1326"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:33 crc kubenswrapper[4772]: I0930 17:22:33.888794 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a276efa-f7fa-4a91-8489-4be031bd1326-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.036594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a276efa-f7fa-4a91-8489-4be031bd1326","Type":"ContainerDied","Data":"ba5a9195c19af6353be6dcff0fe012403fe22d9718323c86d0108b661b1dfcf8"} Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.036669 4772 scope.go:117] "RemoveContainer" containerID="d0c7ec5448909d995c08305d3d28bd109a828b4b0c22cf3b35d8bac72d7da019" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.036882 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.106117 4772 scope.go:117] "RemoveContainer" containerID="0cc28e31a723e360062dd6334b36e4624fd92409764d3b236995de8df5bbc0e7" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.110187 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.128549 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.146452 4772 scope.go:117] "RemoveContainer" containerID="eac16b1dadeea7ee32522305d36abca308824945f74bec751a6fa34dfb7866c3" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.156819 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:34 crc kubenswrapper[4772]: E0930 17:22:34.157332 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="sg-core" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157351 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="sg-core" Sep 30 17:22:34 crc kubenswrapper[4772]: E0930 17:22:34.157369 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-central-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157376 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-central-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: E0930 17:22:34.157392 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-notification-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157398 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-notification-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: E0930 17:22:34.157409 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="proxy-httpd" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157415 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="proxy-httpd" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157593 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="proxy-httpd" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157616 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-central-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157631 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="sg-core" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.157644 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" containerName="ceilometer-notification-agent" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.161130 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.164206 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.164409 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.165695 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.171825 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.178992 4772 scope.go:117] "RemoveContainer" containerID="04b440239877ccf50ec895dcdb517c6a763cf5a4e472d33b47619717176d79fc" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.312492 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313223 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313328 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.313885 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvxh7\" (UniqueName: \"kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415220 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415258 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvxh7\" (UniqueName: \"kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415311 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415370 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.415975 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 
17:22:34.416136 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.420173 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.420556 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.421287 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.424361 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.433903 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.436604 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvxh7\" (UniqueName: \"kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7\") pod \"ceilometer-0\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.477252 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:34 crc kubenswrapper[4772]: I0930 17:22:34.953342 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.045325 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerStarted","Data":"b43ff6ddcda835a6dc65f71a73c2b617ba4c6c7e51d07bbb0ff41583431d5913"} Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.487622 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.908599 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a276efa-f7fa-4a91-8489-4be031bd1326" path="/var/lib/kubelet/pods/4a276efa-f7fa-4a91-8489-4be031bd1326/volumes" Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.923311 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bnzg4"] Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.925600 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:35 crc kubenswrapper[4772]: I0930 17:22:35.943497 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bnzg4"] Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.029534 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vqdkk"] Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.031363 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.045251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dswg\" (UniqueName: \"kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg\") pod \"nova-api-db-create-bnzg4\" (UID: \"4018ba50-6562-42a4-ba6a-70d499df4c43\") " pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.057132 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vqdkk"] Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.066750 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerStarted","Data":"f74659c6239dadec03be2b2a6fb15792152684f829e37077d2dce03cefd13dcc"} Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.150231 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dswg\" (UniqueName: \"kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg\") pod \"nova-api-db-create-bnzg4\" (UID: \"4018ba50-6562-42a4-ba6a-70d499df4c43\") " pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.150573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frdp\" (UniqueName: \"kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp\") pod \"nova-cell0-db-create-vqdkk\" (UID: \"82ffc0b4-4f98-45c5-b395-ee9defa7f57d\") " pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.199830 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6dswg\" (UniqueName: \"kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg\") pod \"nova-api-db-create-bnzg4\" (UID: \"4018ba50-6562-42a4-ba6a-70d499df4c43\") " pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.253426 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frdp\" (UniqueName: \"kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp\") pod \"nova-cell0-db-create-vqdkk\" (UID: \"82ffc0b4-4f98-45c5-b395-ee9defa7f57d\") " pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.260541 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.268256 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-r8bf5"] Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.269802 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.294539 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frdp\" (UniqueName: \"kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp\") pod \"nova-cell0-db-create-vqdkk\" (UID: \"82ffc0b4-4f98-45c5-b395-ee9defa7f57d\") " pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.299594 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-r8bf5"] Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.355216 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.462569 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vh8\" (UniqueName: \"kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8\") pod \"nova-cell1-db-create-r8bf5\" (UID: \"b65e356d-bb3c-4d60-b7f1-9b29c648351e\") " pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.565159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vh8\" (UniqueName: \"kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8\") pod \"nova-cell1-db-create-r8bf5\" (UID: \"b65e356d-bb3c-4d60-b7f1-9b29c648351e\") " pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.581922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vh8\" (UniqueName: \"kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8\") pod \"nova-cell1-db-create-r8bf5\" (UID: \"b65e356d-bb3c-4d60-b7f1-9b29c648351e\") " pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.757607 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:36 crc kubenswrapper[4772]: I0930 17:22:36.925017 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bnzg4"] Sep 30 17:22:37 crc kubenswrapper[4772]: I0930 17:22:37.081416 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerStarted","Data":"3b670a8467b2097376007c54c90c98173986781b5dc3ad7a4db6c83394bc519b"} Sep 30 17:22:37 crc kubenswrapper[4772]: I0930 17:22:37.086663 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bnzg4" event={"ID":"4018ba50-6562-42a4-ba6a-70d499df4c43","Type":"ContainerStarted","Data":"1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef"} Sep 30 17:22:37 crc kubenswrapper[4772]: I0930 17:22:37.167244 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vqdkk"] Sep 30 17:22:37 crc kubenswrapper[4772]: W0930 17:22:37.312383 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb65e356d_bb3c_4d60_b7f1_9b29c648351e.slice/crio-1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f WatchSource:0}: Error finding container 1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f: Status 404 returned error can't find the container with id 1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f Sep 30 17:22:37 crc kubenswrapper[4772]: I0930 17:22:37.313218 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-r8bf5"] Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.100900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r8bf5" event={"ID":"b65e356d-bb3c-4d60-b7f1-9b29c648351e","Type":"ContainerStarted","Data":"898b1f9cdb8d3d1669bcfb246cd68e332f98c52278b79a3263da98a9c1ea36c5"} Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.100945 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r8bf5" event={"ID":"b65e356d-bb3c-4d60-b7f1-9b29c648351e","Type":"ContainerStarted","Data":"1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f"} Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.104086 4772 generic.go:334] "Generic (PLEG): container finished" podID="82ffc0b4-4f98-45c5-b395-ee9defa7f57d" containerID="11be6602a2c00ad08b0a3d8ca8a9c49225fb5060588b17cc294ecb1c8af35dea" exitCode=0 Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.104148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vqdkk" event={"ID":"82ffc0b4-4f98-45c5-b395-ee9defa7f57d","Type":"ContainerDied","Data":"11be6602a2c00ad08b0a3d8ca8a9c49225fb5060588b17cc294ecb1c8af35dea"} Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.104168 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vqdkk" event={"ID":"82ffc0b4-4f98-45c5-b395-ee9defa7f57d","Type":"ContainerStarted","Data":"c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f"} Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.108907 4772 generic.go:334] "Generic (PLEG): container finished" podID="4018ba50-6562-42a4-ba6a-70d499df4c43" containerID="e0bf65f0c7f3dfa3bd2d41991afa2b8dc55b2ce58fa62e7a966c8b3e6a359764" exitCode=0 Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.108955 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bnzg4" event={"ID":"4018ba50-6562-42a4-ba6a-70d499df4c43","Type":"ContainerDied","Data":"e0bf65f0c7f3dfa3bd2d41991afa2b8dc55b2ce58fa62e7a966c8b3e6a359764"} Sep 30 17:22:38 crc kubenswrapper[4772]: I0930 17:22:38.133489 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-r8bf5" podStartSLOduration=2.133469715 podStartE2EDuration="2.133469715s" podCreationTimestamp="2025-09-30 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:22:38.118288923 +0000 UTC m=+1259.025301754" watchObservedRunningTime="2025-09-30 17:22:38.133469715 +0000 UTC m=+1259.040482536" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.118167 4772 generic.go:334] "Generic (PLEG): container finished" podID="b65e356d-bb3c-4d60-b7f1-9b29c648351e" containerID="898b1f9cdb8d3d1669bcfb246cd68e332f98c52278b79a3263da98a9c1ea36c5" exitCode=0 Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.118358 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r8bf5" event={"ID":"b65e356d-bb3c-4d60-b7f1-9b29c648351e","Type":"ContainerDied","Data":"898b1f9cdb8d3d1669bcfb246cd68e332f98c52278b79a3263da98a9c1ea36c5"} Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.703412 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.710529 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.865937 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dswg\" (UniqueName: \"kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg\") pod \"4018ba50-6562-42a4-ba6a-70d499df4c43\" (UID: \"4018ba50-6562-42a4-ba6a-70d499df4c43\") " Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.865995 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frdp\" (UniqueName: \"kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp\") pod \"82ffc0b4-4f98-45c5-b395-ee9defa7f57d\" (UID: \"82ffc0b4-4f98-45c5-b395-ee9defa7f57d\") " Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.886190 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp" (OuterVolumeSpecName: "kube-api-access-5frdp") pod "82ffc0b4-4f98-45c5-b395-ee9defa7f57d" (UID: "82ffc0b4-4f98-45c5-b395-ee9defa7f57d"). InnerVolumeSpecName "kube-api-access-5frdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.899317 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg" (OuterVolumeSpecName: "kube-api-access-6dswg") pod "4018ba50-6562-42a4-ba6a-70d499df4c43" (UID: "4018ba50-6562-42a4-ba6a-70d499df4c43"). InnerVolumeSpecName "kube-api-access-6dswg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.970515 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dswg\" (UniqueName: \"kubernetes.io/projected/4018ba50-6562-42a4-ba6a-70d499df4c43-kube-api-access-6dswg\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:39 crc kubenswrapper[4772]: I0930 17:22:39.970561 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5frdp\" (UniqueName: \"kubernetes.io/projected/82ffc0b4-4f98-45c5-b395-ee9defa7f57d-kube-api-access-5frdp\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.127381 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerStarted","Data":"bca4ee58f73ca5ab2167f49913353850bad11a9123818925df4f466bca92d1fc"} Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.129495 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bnzg4" event={"ID":"4018ba50-6562-42a4-ba6a-70d499df4c43","Type":"ContainerDied","Data":"1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef"} Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.129526 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.129579 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bnzg4" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.133844 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vqdkk" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.133903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vqdkk" event={"ID":"82ffc0b4-4f98-45c5-b395-ee9defa7f57d","Type":"ContainerDied","Data":"c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f"} Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.133928 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f" Sep 30 17:22:40 crc kubenswrapper[4772]: E0930 17:22:40.185480 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache]" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.509544 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.683855 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6vh8\" (UniqueName: \"kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8\") pod \"b65e356d-bb3c-4d60-b7f1-9b29c648351e\" (UID: \"b65e356d-bb3c-4d60-b7f1-9b29c648351e\") " Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.690907 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8" (OuterVolumeSpecName: "kube-api-access-p6vh8") pod "b65e356d-bb3c-4d60-b7f1-9b29c648351e" (UID: "b65e356d-bb3c-4d60-b7f1-9b29c648351e"). InnerVolumeSpecName "kube-api-access-p6vh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:40 crc kubenswrapper[4772]: I0930 17:22:40.785986 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6vh8\" (UniqueName: \"kubernetes.io/projected/b65e356d-bb3c-4d60-b7f1-9b29c648351e-kube-api-access-p6vh8\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:41 crc kubenswrapper[4772]: I0930 17:22:41.144545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r8bf5" event={"ID":"b65e356d-bb3c-4d60-b7f1-9b29c648351e","Type":"ContainerDied","Data":"1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f"} Sep 30 17:22:41 crc kubenswrapper[4772]: I0930 17:22:41.144570 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-r8bf5" Sep 30 17:22:41 crc kubenswrapper[4772]: I0930 17:22:41.149828 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1945938095ea5856ea0309b4c098cb9be34c5562b4630eab8617884bd9544d2f" Sep 30 17:22:41 crc kubenswrapper[4772]: I0930 17:22:41.898929 4772 scope.go:117] "RemoveContainer" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" Sep 30 17:22:41 crc kubenswrapper[4772]: E0930 17:22:41.899731 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.160503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerStarted","Data":"7a422bf6341d4d3c20aa63a7b634c9736ed45db355e17edaa53c301b5ea224f3"} Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.160726 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-central-agent" containerID="cri-o://f74659c6239dadec03be2b2a6fb15792152684f829e37077d2dce03cefd13dcc" gracePeriod=30 Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.160790 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="sg-core" containerID="cri-o://bca4ee58f73ca5ab2167f49913353850bad11a9123818925df4f466bca92d1fc" gracePeriod=30 Sep 30 17:22:42 crc kubenswrapper[4772]: 
I0930 17:22:42.160890 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-notification-agent" containerID="cri-o://3b670a8467b2097376007c54c90c98173986781b5dc3ad7a4db6c83394bc519b" gracePeriod=30 Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.160966 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="proxy-httpd" containerID="cri-o://7a422bf6341d4d3c20aa63a7b634c9736ed45db355e17edaa53c301b5ea224f3" gracePeriod=30 Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.161968 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:22:42 crc kubenswrapper[4772]: I0930 17:22:42.202764 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.908717727 podStartE2EDuration="8.202742936s" podCreationTimestamp="2025-09-30 17:22:34 +0000 UTC" firstStartedPulling="2025-09-30 17:22:34.959861208 +0000 UTC m=+1255.866874039" lastFinishedPulling="2025-09-30 17:22:41.253886417 +0000 UTC m=+1262.160899248" observedRunningTime="2025-09-30 17:22:42.192658205 +0000 UTC m=+1263.099671036" watchObservedRunningTime="2025-09-30 17:22:42.202742936 +0000 UTC m=+1263.109755767" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173615 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d22208c-366b-4618-87a8-a0b3492710e2" containerID="7a422bf6341d4d3c20aa63a7b634c9736ed45db355e17edaa53c301b5ea224f3" exitCode=0 Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173901 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d22208c-366b-4618-87a8-a0b3492710e2" containerID="bca4ee58f73ca5ab2167f49913353850bad11a9123818925df4f466bca92d1fc" exitCode=2 Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173912 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d22208c-366b-4618-87a8-a0b3492710e2" containerID="3b670a8467b2097376007c54c90c98173986781b5dc3ad7a4db6c83394bc519b" exitCode=0 Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173919 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d22208c-366b-4618-87a8-a0b3492710e2" containerID="f74659c6239dadec03be2b2a6fb15792152684f829e37077d2dce03cefd13dcc" exitCode=0 Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerDied","Data":"7a422bf6341d4d3c20aa63a7b634c9736ed45db355e17edaa53c301b5ea224f3"} Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerDied","Data":"bca4ee58f73ca5ab2167f49913353850bad11a9123818925df4f466bca92d1fc"} Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173973 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerDied","Data":"3b670a8467b2097376007c54c90c98173986781b5dc3ad7a4db6c83394bc519b"} Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.173983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerDied","Data":"f74659c6239dadec03be2b2a6fb15792152684f829e37077d2dce03cefd13dcc"} Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.400483 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.534614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.534944 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535433 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535500 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535811 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535858 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535902 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvxh7\" (UniqueName: \"kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535931 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: \"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.535978 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle\") pod \"2d22208c-366b-4618-87a8-a0b3492710e2\" (UID: 
\"2d22208c-366b-4618-87a8-a0b3492710e2\") " Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.536339 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.536830 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.536851 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d22208c-366b-4618-87a8-a0b3492710e2-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.541106 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts" (OuterVolumeSpecName: "scripts") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.541513 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7" (OuterVolumeSpecName: "kube-api-access-wvxh7") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "kube-api-access-wvxh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.563373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.587855 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.620074 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.638850 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.638889 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvxh7\" (UniqueName: \"kubernetes.io/projected/2d22208c-366b-4618-87a8-a0b3492710e2-kube-api-access-wvxh7\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.638901 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.638910 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.638919 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.641434 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data" (OuterVolumeSpecName: "config-data") pod "2d22208c-366b-4618-87a8-a0b3492710e2" (UID: "2d22208c-366b-4618-87a8-a0b3492710e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:43 crc kubenswrapper[4772]: I0930 17:22:43.740499 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d22208c-366b-4618-87a8-a0b3492710e2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.197198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d22208c-366b-4618-87a8-a0b3492710e2","Type":"ContainerDied","Data":"b43ff6ddcda835a6dc65f71a73c2b617ba4c6c7e51d07bbb0ff41583431d5913"} Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.197255 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.197260 4772 scope.go:117] "RemoveContainer" containerID="7a422bf6341d4d3c20aa63a7b634c9736ed45db355e17edaa53c301b5ea224f3" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.263315 4772 scope.go:117] "RemoveContainer" containerID="bca4ee58f73ca5ab2167f49913353850bad11a9123818925df4f466bca92d1fc" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.293979 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.343305 4772 scope.go:117] "RemoveContainer" containerID="3b670a8467b2097376007c54c90c98173986781b5dc3ad7a4db6c83394bc519b" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.384425 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.433217 4772 scope.go:117] "RemoveContainer" containerID="f74659c6239dadec03be2b2a6fb15792152684f829e37077d2dce03cefd13dcc" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.434644 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435105 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65e356d-bb3c-4d60-b7f1-9b29c648351e" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435126 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65e356d-bb3c-4d60-b7f1-9b29c648351e" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435141 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-notification-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435147 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-notification-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435154 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ffc0b4-4f98-45c5-b395-ee9defa7f57d" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435160 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ffc0b4-4f98-45c5-b395-ee9defa7f57d" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435175 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="proxy-httpd" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435182 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="proxy-httpd" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435194 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4018ba50-6562-42a4-ba6a-70d499df4c43" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435199 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4018ba50-6562-42a4-ba6a-70d499df4c43" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435209 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-central-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435215 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-central-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: E0930 17:22:44.435228 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="sg-core" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435233 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="sg-core" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435414 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ffc0b4-4f98-45c5-b395-ee9defa7f57d" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435428 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="sg-core" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435446 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-notification-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435455 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4018ba50-6562-42a4-ba6a-70d499df4c43" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435466 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="ceilometer-central-agent" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435494 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65e356d-bb3c-4d60-b7f1-9b29c648351e" containerName="mariadb-database-create" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.435504 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" containerName="proxy-httpd" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.439717 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.442901 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.447242 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.448547 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.448910 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496039 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496580 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496621 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496692 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496716 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9df8v\" (UniqueName: \"kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.496738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598624 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598718 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598783 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9df8v\" (UniqueName: \"kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.598899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.599261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.599521 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.608841 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.609564 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.610328 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.610909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.616030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.619037 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9df8v\" (UniqueName: \"kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v\") pod \"ceilometer-0\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " pod="openstack/ceilometer-0" Sep 30 17:22:44 crc kubenswrapper[4772]: I0930 17:22:44.777778 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:22:45 crc kubenswrapper[4772]: I0930 17:22:45.303493 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:22:45 crc kubenswrapper[4772]: W0930 17:22:45.306071 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf43550b7_9e60_4018_99bb_c1ef5c05b022.slice/crio-776a8745bcd7081f1e83fa71734cc8d1d2d908537ac54ab250c45ae4641a877c WatchSource:0}: Error finding container 776a8745bcd7081f1e83fa71734cc8d1d2d908537ac54ab250c45ae4641a877c: Status 404 returned error can't find the container with id 776a8745bcd7081f1e83fa71734cc8d1d2d908537ac54ab250c45ae4641a877c Sep 30 17:22:45 crc kubenswrapper[4772]: I0930 17:22:45.911909 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d22208c-366b-4618-87a8-a0b3492710e2" path="/var/lib/kubelet/pods/2d22208c-366b-4618-87a8-a0b3492710e2/volumes" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.177008 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1dab-account-create-h2p9r"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.179328 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.189456 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.198886 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1dab-account-create-h2p9r"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.232018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerStarted","Data":"776a8745bcd7081f1e83fa71734cc8d1d2d908537ac54ab250c45ae4641a877c"} Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.238698 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cws8d\" (UniqueName: \"kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d\") pod \"nova-api-1dab-account-create-h2p9r\" (UID: \"8412255b-f9d0-4df8-a46a-b2e5f929322e\") " pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.340304 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cws8d\" (UniqueName: \"kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d\") pod \"nova-api-1dab-account-create-h2p9r\" (UID: \"8412255b-f9d0-4df8-a46a-b2e5f929322e\") " pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.368314 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cws8d\" (UniqueName: \"kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d\") pod \"nova-api-1dab-account-create-h2p9r\" (UID: \"8412255b-f9d0-4df8-a46a-b2e5f929322e\") " pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.382698 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-066d-account-create-7ntjb"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.383941 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.387466 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.399655 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-066d-account-create-7ntjb"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.443108 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm9ql\" (UniqueName: \"kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql\") pod \"nova-cell0-066d-account-create-7ntjb\" (UID: \"565231b6-3cd5-4d24-bb02-114c04ef14f6\") " pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.506186 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.544734 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm9ql\" (UniqueName: \"kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql\") pod \"nova-cell0-066d-account-create-7ntjb\" (UID: \"565231b6-3cd5-4d24-bb02-114c04ef14f6\") " pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.563320 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm9ql\" (UniqueName: \"kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql\") pod \"nova-cell0-066d-account-create-7ntjb\" (UID: \"565231b6-3cd5-4d24-bb02-114c04ef14f6\") " pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.586483 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-adee-account-create-vcvm9"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.588272 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.591322 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.601107 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-adee-account-create-vcvm9"] Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.646413 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hsf\" (UniqueName: \"kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf\") pod \"nova-cell1-adee-account-create-vcvm9\" (UID: \"4940e5ae-c8a0-497e-884c-32b360630a9a\") " pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.738972 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.747656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56hsf\" (UniqueName: \"kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf\") pod \"nova-cell1-adee-account-create-vcvm9\" (UID: \"4940e5ae-c8a0-497e-884c-32b360630a9a\") " pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.769991 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hsf\" (UniqueName: \"kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf\") pod \"nova-cell1-adee-account-create-vcvm9\" (UID: \"4940e5ae-c8a0-497e-884c-32b360630a9a\") " pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:46 crc kubenswrapper[4772]: I0930 17:22:46.933247 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:47 crc kubenswrapper[4772]: I0930 17:22:47.035192 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1dab-account-create-h2p9r"] Sep 30 17:22:47 crc kubenswrapper[4772]: W0930 17:22:47.073234 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8412255b_f9d0_4df8_a46a_b2e5f929322e.slice/crio-ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527 WatchSource:0}: Error finding container ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527: Status 404 returned error can't find the container with id ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527 Sep 30 17:22:47 crc kubenswrapper[4772]: I0930 17:22:47.234633 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-066d-account-create-7ntjb"] Sep 30 17:22:47 crc kubenswrapper[4772]: I0930 17:22:47.249044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1dab-account-create-h2p9r" event={"ID":"8412255b-f9d0-4df8-a46a-b2e5f929322e","Type":"ContainerStarted","Data":"ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527"} Sep 30 17:22:47 crc kubenswrapper[4772]: I0930 17:22:47.397889 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-adee-account-create-vcvm9"] Sep 30 17:22:47 crc kubenswrapper[4772]: W0930 17:22:47.400461 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4940e5ae_c8a0_497e_884c_32b360630a9a.slice/crio-3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec WatchSource:0}: Error finding container 3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec: Status 404 returned error can't find the container with id 3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.260725 4772 generic.go:334] "Generic (PLEG): container finished" podID="4940e5ae-c8a0-497e-884c-32b360630a9a" containerID="d95e8415dcd8b91a41f27ba874da5277dab1216884ebc147e2dc2d3d2ba245ce" exitCode=0 Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.260784 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adee-account-create-vcvm9" 
event={"ID":"4940e5ae-c8a0-497e-884c-32b360630a9a","Type":"ContainerDied","Data":"d95e8415dcd8b91a41f27ba874da5277dab1216884ebc147e2dc2d3d2ba245ce"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.261559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adee-account-create-vcvm9" event={"ID":"4940e5ae-c8a0-497e-884c-32b360630a9a","Type":"ContainerStarted","Data":"3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.263137 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerStarted","Data":"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.264599 4772 generic.go:334] "Generic (PLEG): container finished" podID="8412255b-f9d0-4df8-a46a-b2e5f929322e" containerID="24bcff46048882e0c870963a4248e25fcbedec9f5c29aaae4c662cd99ff2c760" exitCode=0 Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.264758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1dab-account-create-h2p9r" event={"ID":"8412255b-f9d0-4df8-a46a-b2e5f929322e","Type":"ContainerDied","Data":"24bcff46048882e0c870963a4248e25fcbedec9f5c29aaae4c662cd99ff2c760"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.268602 4772 generic.go:334] "Generic (PLEG): container finished" podID="565231b6-3cd5-4d24-bb02-114c04ef14f6" containerID="c475f9da79ee17b14ab01114b90aa45593cd79f2b039b5ab7ccbef79e13d3837" exitCode=0 Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.268741 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-066d-account-create-7ntjb" event={"ID":"565231b6-3cd5-4d24-bb02-114c04ef14f6","Type":"ContainerDied","Data":"c475f9da79ee17b14ab01114b90aa45593cd79f2b039b5ab7ccbef79e13d3837"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.268817 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-066d-account-create-7ntjb" event={"ID":"565231b6-3cd5-4d24-bb02-114c04ef14f6","Type":"ContainerStarted","Data":"5554bfbadb119dab7fa523f6f650e58be04cf9b998ec222464f9644e9e85841b"} Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.533135 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:22:48 crc kubenswrapper[4772]: I0930 17:22:48.533966 4772 scope.go:117] "RemoveContainer" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" Sep 30 17:22:48 crc kubenswrapper[4772]: E0930 17:22:48.534185 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(69f02322-0ff1-410e-8b46-dd3b5f909963)\"" pod="openstack/watcher-decision-engine-0" podUID="69f02322-0ff1-410e-8b46-dd3b5f909963" Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.293351 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerStarted","Data":"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5"} Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.818934 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.923161 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm9ql\" (UniqueName: \"kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql\") pod \"565231b6-3cd5-4d24-bb02-114c04ef14f6\" (UID: \"565231b6-3cd5-4d24-bb02-114c04ef14f6\") " Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.932750 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql" (OuterVolumeSpecName: "kube-api-access-cm9ql") pod "565231b6-3cd5-4d24-bb02-114c04ef14f6" (UID: "565231b6-3cd5-4d24-bb02-114c04ef14f6"). InnerVolumeSpecName "kube-api-access-cm9ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.967460 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:49 crc kubenswrapper[4772]: I0930 17:22:49.975741 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.027195 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm9ql\" (UniqueName: \"kubernetes.io/projected/565231b6-3cd5-4d24-bb02-114c04ef14f6-kube-api-access-cm9ql\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.128406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56hsf\" (UniqueName: \"kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf\") pod \"4940e5ae-c8a0-497e-884c-32b360630a9a\" (UID: \"4940e5ae-c8a0-497e-884c-32b360630a9a\") " Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.128474 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cws8d\" (UniqueName: \"kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d\") pod \"8412255b-f9d0-4df8-a46a-b2e5f929322e\" (UID: \"8412255b-f9d0-4df8-a46a-b2e5f929322e\") " Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.132039 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf" (OuterVolumeSpecName: "kube-api-access-56hsf") pod "4940e5ae-c8a0-497e-884c-32b360630a9a" (UID: "4940e5ae-c8a0-497e-884c-32b360630a9a"). InnerVolumeSpecName "kube-api-access-56hsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.132691 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d" (OuterVolumeSpecName: "kube-api-access-cws8d") pod "8412255b-f9d0-4df8-a46a-b2e5f929322e" (UID: "8412255b-f9d0-4df8-a46a-b2e5f929322e"). InnerVolumeSpecName "kube-api-access-cws8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.231680 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cws8d\" (UniqueName: \"kubernetes.io/projected/8412255b-f9d0-4df8-a46a-b2e5f929322e-kube-api-access-cws8d\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.231763 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56hsf\" (UniqueName: \"kubernetes.io/projected/4940e5ae-c8a0-497e-884c-32b360630a9a-kube-api-access-56hsf\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.305321 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-066d-account-create-7ntjb" event={"ID":"565231b6-3cd5-4d24-bb02-114c04ef14f6","Type":"ContainerDied","Data":"5554bfbadb119dab7fa523f6f650e58be04cf9b998ec222464f9644e9e85841b"} Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.305394 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5554bfbadb119dab7fa523f6f650e58be04cf9b998ec222464f9644e9e85841b" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.305564 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-066d-account-create-7ntjb" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.309880 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adee-account-create-vcvm9" event={"ID":"4940e5ae-c8a0-497e-884c-32b360630a9a","Type":"ContainerDied","Data":"3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec"} Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.310028 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3151fb1dab6c6b6ebd8726c8bb1443010034d69f61c61593df07c5e494ca3bec" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.309941 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-adee-account-create-vcvm9" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.313674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerStarted","Data":"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb"} Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.315767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1dab-account-create-h2p9r" event={"ID":"8412255b-f9d0-4df8-a46a-b2e5f929322e","Type":"ContainerDied","Data":"ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527"} Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.315793 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca049d614b55affc3c874509a8beb8e4ef7da1188d748872e90d8a3eac173527" Sep 30 17:22:50 crc kubenswrapper[4772]: I0930 17:22:50.315941 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1dab-account-create-h2p9r" Sep 30 17:22:50 crc kubenswrapper[4772]: E0930 17:22:50.531610 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice/crio-c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4940e5ae_c8a0_497e_884c_32b360630a9a.slice\": RecentStats: unable to find data in memory cache]" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.748955 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grhrs"] Sep 30 17:22:51 crc kubenswrapper[4772]: E0930 17:22:51.750026 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8412255b-f9d0-4df8-a46a-b2e5f929322e" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750046 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8412255b-f9d0-4df8-a46a-b2e5f929322e" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: E0930 17:22:51.750094 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565231b6-3cd5-4d24-bb02-114c04ef14f6" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750106 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="565231b6-3cd5-4d24-bb02-114c04ef14f6" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: E0930 17:22:51.750150 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4940e5ae-c8a0-497e-884c-32b360630a9a" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750156 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4940e5ae-c8a0-497e-884c-32b360630a9a" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750352 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8412255b-f9d0-4df8-a46a-b2e5f929322e" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750364 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4940e5ae-c8a0-497e-884c-32b360630a9a" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.750379 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="565231b6-3cd5-4d24-bb02-114c04ef14f6" containerName="mariadb-account-create" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.751232 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.753489 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5hmf5" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.753617 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.754159 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.773566 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54t5q\" (UniqueName: \"kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.773642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.773710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.773856 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.785476 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grhrs"] Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.876610 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.876769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54t5q\" (UniqueName: \"kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.876814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data\") pod \"nova-cell0-conductor-db-sync-grhrs\" 
(UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.876843 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.881689 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.881747 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.887133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:51 crc kubenswrapper[4772]: I0930 17:22:51.894770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54t5q\" (UniqueName: \"kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q\") pod \"nova-cell0-conductor-db-sync-grhrs\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:52 crc kubenswrapper[4772]: I0930 17:22:52.069809 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:22:52 crc kubenswrapper[4772]: I0930 17:22:52.338027 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerStarted","Data":"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0"} Sep 30 17:22:52 crc kubenswrapper[4772]: I0930 17:22:52.339346 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:22:52 crc kubenswrapper[4772]: I0930 17:22:52.369125 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2244904070000002 podStartE2EDuration="8.369101946s" podCreationTimestamp="2025-09-30 17:22:44 +0000 UTC" firstStartedPulling="2025-09-30 17:22:45.308776607 +0000 UTC m=+1266.215789438" lastFinishedPulling="2025-09-30 17:22:51.453388146 +0000 UTC m=+1272.360400977" observedRunningTime="2025-09-30 17:22:52.365239626 +0000 UTC m=+1273.272252467" watchObservedRunningTime="2025-09-30 17:22:52.369101946 +0000 UTC m=+1273.276114807" Sep 30 17:22:52 crc kubenswrapper[4772]: I0930 17:22:52.610340 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grhrs"] Sep 30 17:22:53 crc kubenswrapper[4772]: I0930 17:22:53.356101 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grhrs" event={"ID":"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7","Type":"ContainerStarted","Data":"4d58aee22630040ea9002c078d1b676f146249db9c9d9aabadb970638965f5a2"} Sep 30 17:23:00 crc kubenswrapper[4772]: E0930 17:23:00.799540 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice/crio-c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f\": RecentStats: unable to find data in memory cache]" Sep 30 17:23:00 crc kubenswrapper[4772]: I0930 17:23:00.898023 4772 scope.go:117] "RemoveContainer" containerID="68f436a0b396e40665edf7ccd1d1ea86856068f26633dcde297211f7c7677e58" Sep 30 17:23:07 crc kubenswrapper[4772]: E0930 17:23:07.323093 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Sep 30 17:23:07 crc kubenswrapper[4772]: E0930 17:23:07.323978 4772 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.221:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Sep 30 17:23:07 crc kubenswrapper[4772]: E0930 17:23:07.324164 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:nova-cell0-conductor-db-sync,Image:38.129.56.221:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54t5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-grhrs_openstack(9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:23:07 crc kubenswrapper[4772]: E0930 17:23:07.325369 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-grhrs" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" Sep 30 17:23:07 crc kubenswrapper[4772]: I0930 17:23:07.510730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"69f02322-0ff1-410e-8b46-dd3b5f909963","Type":"ContainerStarted","Data":"77aa42ec719aee365825d2f31c46965e09cc35bdb6e5aa066385db1258836bce"} Sep 30 17:23:07 crc kubenswrapper[4772]: E0930 17:23:07.512011 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.221:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-grhrs" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" Sep 30 17:23:08 crc kubenswrapper[4772]: I0930 17:23:08.533416 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 17:23:08 crc 
kubenswrapper[4772]: I0930 17:23:08.570238 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 17:23:09 crc kubenswrapper[4772]: I0930 17:23:09.545809 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 17:23:09 crc kubenswrapper[4772]: I0930 17:23:09.586018 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 17:23:11 crc kubenswrapper[4772]: E0930 17:23:11.042334 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice/crio-c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache]" Sep 30 17:23:14 crc kubenswrapper[4772]: I0930 17:23:14.786112 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 17:23:21 crc kubenswrapper[4772]: E0930 17:23:21.288871 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice/crio-c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache]" Sep 30 17:23:21 crc kubenswrapper[4772]: I0930 17:23:21.662516 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grhrs" event={"ID":"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7","Type":"ContainerStarted","Data":"79b64bea6aaa5c47f6b2caac074731cf66c3ed741930fc4e98fa8683a63f6bae"} Sep 30 17:23:21 crc kubenswrapper[4772]: I0930 17:23:21.684575 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-grhrs" podStartSLOduration=2.205442489 podStartE2EDuration="30.684552391s" podCreationTimestamp="2025-09-30 17:22:51 +0000 UTC" firstStartedPulling="2025-09-30 17:22:52.652543853 +0000 UTC m=+1273.559556684" lastFinishedPulling="2025-09-30 17:23:21.131653755 +0000 UTC m=+1302.038666586" observedRunningTime="2025-09-30 17:23:21.678661876 +0000 UTC m=+1302.585674727" watchObservedRunningTime="2025-09-30 17:23:21.684552391 
+0000 UTC m=+1302.591565232" Sep 30 17:23:31 crc kubenswrapper[4772]: E0930 17:23:31.532329 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ffc0b4_4f98_45c5_b395_ee9defa7f57d.slice/crio-c3770539902a16e74e3e39560dedfd4186d992d94329372061781b7b9d728d3f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4018ba50_6562_42a4_ba6a_70d499df4c43.slice/crio-1dfd57ce816d2bbdf2ae36000221a541c36ab9ea26978c490d743f47a1903fef\": RecentStats: unable to find data in memory cache]" Sep 30 17:24:08 crc kubenswrapper[4772]: I0930 17:24:08.655571 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:24:08 crc kubenswrapper[4772]: I0930 17:24:08.656656 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:24:17 crc kubenswrapper[4772]: I0930 17:24:17.241760 4772 generic.go:334] "Generic (PLEG): container finished" podID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" containerID="79b64bea6aaa5c47f6b2caac074731cf66c3ed741930fc4e98fa8683a63f6bae" exitCode=0 Sep 30 17:24:17 crc kubenswrapper[4772]: I0930 17:24:17.241843 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grhrs" event={"ID":"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7","Type":"ContainerDied","Data":"79b64bea6aaa5c47f6b2caac074731cf66c3ed741930fc4e98fa8683a63f6bae"} Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.611446 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.762969 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle\") pod \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.763089 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54t5q\" (UniqueName: \"kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q\") pod \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.763137 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data\") pod \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.763176 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts\") pod \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\" (UID: \"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7\") " Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.768661 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts" (OuterVolumeSpecName: "scripts") pod "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" (UID: "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.769538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q" (OuterVolumeSpecName: "kube-api-access-54t5q") pod "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" (UID: "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7"). InnerVolumeSpecName "kube-api-access-54t5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.793917 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" (UID: "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.795243 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data" (OuterVolumeSpecName: "config-data") pod "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" (UID: "9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.865847 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.865885 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54t5q\" (UniqueName: \"kubernetes.io/projected/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-kube-api-access-54t5q\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.865896 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:18 crc kubenswrapper[4772]: I0930 17:24:18.865904 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.259657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grhrs" event={"ID":"9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7","Type":"ContainerDied","Data":"4d58aee22630040ea9002c078d1b676f146249db9c9d9aabadb970638965f5a2"} Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.259952 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d58aee22630040ea9002c078d1b676f146249db9c9d9aabadb970638965f5a2" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.259747 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grhrs" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.364566 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:24:19 crc kubenswrapper[4772]: E0930 17:24:19.365042 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" containerName="nova-cell0-conductor-db-sync" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.365145 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" containerName="nova-cell0-conductor-db-sync" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.365344 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" containerName="nova-cell0-conductor-db-sync" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.366026 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.370159 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5hmf5" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.370390 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.387197 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.493552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbxx\" (UniqueName: \"kubernetes.io/projected/a242b41e-98a7-4814-984c-70b36be61cb9-kube-api-access-hrbxx\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.493747 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.493901 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.596090 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.596262 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.596340 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbxx\" (UniqueName: \"kubernetes.io/projected/a242b41e-98a7-4814-984c-70b36be61cb9-kube-api-access-hrbxx\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.601371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.603011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a242b41e-98a7-4814-984c-70b36be61cb9-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.618162 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrbxx\" (UniqueName: \"kubernetes.io/projected/a242b41e-98a7-4814-984c-70b36be61cb9-kube-api-access-hrbxx\") pod \"nova-cell0-conductor-0\" (UID: \"a242b41e-98a7-4814-984c-70b36be61cb9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:19 crc kubenswrapper[4772]: I0930 17:24:19.693327 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:20 crc kubenswrapper[4772]: I0930 17:24:20.119031 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:24:20 crc kubenswrapper[4772]: I0930 17:24:20.271363 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a242b41e-98a7-4814-984c-70b36be61cb9","Type":"ContainerStarted","Data":"1763ef260219c5c76223e4c750649393086db17287b2f511f18f73a46e816103"} Sep 30 17:24:21 crc kubenswrapper[4772]: I0930 17:24:21.282335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a242b41e-98a7-4814-984c-70b36be61cb9","Type":"ContainerStarted","Data":"8e6710102170dce3e7dead265a2d173ecb9a5e951e7aad072d4c470aa70ae29c"} Sep 30 17:24:21 crc kubenswrapper[4772]: I0930 17:24:21.282622 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:21 crc kubenswrapper[4772]: I0930 17:24:21.308957 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.308930818 podStartE2EDuration="2.308930818s" podCreationTimestamp="2025-09-30 17:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:21.302088468 +0000 UTC m=+1362.209101309" watchObservedRunningTime="2025-09-30 17:24:21.308930818 +0000 UTC m=+1362.215943649" Sep 30 17:24:29 crc kubenswrapper[4772]: I0930 17:24:29.730655 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.204772 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lnbhb"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.206526 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.210621 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.210918 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.223334 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnbhb"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.309531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.309587 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.309791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.309915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46n4\" (UniqueName: \"kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.402760 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.411442 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46n4\" (UniqueName: \"kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.411588 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.411609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc 
kubenswrapper[4772]: I0930 17:24:30.411652 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.412958 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.416548 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.417821 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.422985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.424866 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.434001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.455540 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.457161 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46n4\" (UniqueName: \"kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4\") pod \"nova-cell0-cell-mapping-lnbhb\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.463381 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.469972 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.495861 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.523494 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq2w4\" (UniqueName: \"kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.523608 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.523859 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.523893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.528829 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.579191 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.580963 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.603160 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.604791 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.609271 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.625250 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.625316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.625352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.626749 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.627898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.627951 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctnb\" (UniqueName: \"kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.628070 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq2w4\" (UniqueName: \"kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.628138 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.628218 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 
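
The records above walk each volume of nova-metadata-0 and nova-api-0 through the kubelet volume manager's three-step pipeline: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). Below is a minimal sketch of pairing the second and third records to measure per-volume mount latency from a journal like this one. It is illustrative only: the regexes are editorial assumptions about the message shapes, not kubelet code, and it assumes the journal has been re-split to one record per line.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

// Assumed shapes of the two reconciler records seen above; the UniqueName
// appears in the journal as \"kubernetes.io/<plugin>/<uid>-<name>\".
var (
	clock     = regexp.MustCompile(`I\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})`)
	started   = regexp.MustCompile(`MountVolume started for volume .*?UniqueName: \\"([^"\\]+)\\"`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume .*?UniqueName: \\"([^"\\]+)\\"`)
)

func main() {
	startAt := map[string]time.Time{} // UniqueName -> "MountVolume started" time
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		m := clock.FindStringSubmatch(line)
		if m == nil {
			continue
		}
		t, err := time.Parse("15:04:05.000000", m[1])
		if err != nil {
			continue
		}
		if s := started.FindStringSubmatch(line); s != nil {
			startAt[s[1]] = t
		} else if s := succeeded.FindStringSubmatch(line); s != nil {
			if t0, ok := startAt[s[1]]; ok {
				fmt.Printf("%s mounted in %v\n", s[1], t.Sub(t0))
				delete(startAt, s[1])
			}
		}
	}
}

Applied to the records above, the nova-metadata-0 "logs" emptyDir pairs 17:24:30.625316 with 17:24:30.626749 (about 1.4ms), while the secret-backed volumes take on the order of 15-25ms, since secret payloads must be fetched and written out before SetUp succeeds.
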
17:24:30.640263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.640378 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.648833 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.657678 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.675544 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.677177 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.705469 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.706153 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq2w4\" (UniqueName: \"kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4\") pod \"nova-metadata-0\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.706203 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730394 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730452 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730486 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctnb\" (UniqueName: \"kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730582 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730616 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730688 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730784 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730810 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n86q\" (UniqueName: \"kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsdnl\" (UniqueName: \"kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.730868 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.731538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.736753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.757364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.805391 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctnb\" (UniqueName: \"kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb\") pod \"nova-api-0\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n86q\" (UniqueName: \"kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsdnl\" (UniqueName: \"kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jtw\" (UniqueName: \"kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " 
pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841731 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.841756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.844571 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.845375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.847639 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.848898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.854321 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.856786 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.867720 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.873577 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsdnl\" (UniqueName: \"kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl\") pod \"dnsmasq-dns-957558b67-rfgbf\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.876039 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n86q\" (UniqueName: \"kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.884567 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.948662 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.948771 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jtw\" (UniqueName: \"kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.948814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.954704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.958165 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:30 crc kubenswrapper[4772]: I0930 17:24:30.972123 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jtw\" (UniqueName: \"kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw\") pod \"nova-scheduler-0\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") " pod="openstack/nova-scheduler-0" Sep 
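
Every pod in this capture mounts an auto-generated kube-api-access-* volume through the projected plugin (kubernetes.io/projected/...). The sketch below shows the shape such a volume conventionally takes, built from the upstream k8s.io/api/core/v1 types: a bound service-account token, the cluster CA bundle, and the pod namespace, merged into one mount. The ~1h expiry and the token/ca.crt/namespace file names are the usual defaults and are assumptions here, not values read from this log.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// kubeAPIAccessVolume sketches the projected volume behind the
// "kube-api-access-*" mounts above. Field values follow the conventional
// defaults (assumed), not anything recorded in this journal.
func kubeAPIAccessVolume(name string) corev1.Volume {
	expiry := int64(3607)
	return corev1.Volume{
		Name: name, // e.g. "kube-api-access-rq2w4"
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					// Bound, expiring service-account token.
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					// Cluster CA bundle for verifying the API server.
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					// The pod's namespace via the downward API.
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
}

func main() {
	v := kubeAPIAccessVolume("kube-api-access-rq2w4")
	fmt.Println(v.Name, "projects", len(v.VolumeSource.Projected.Sources), "sources")
}

Because all three sources land in one projected mount, a single MountVolume.SetUp succeeded record covers the token, CA bundle, and namespace file together, which is what the kube-api-access-* entries above show.
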
30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.110754 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.128471 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.164775 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.215244 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnbhb"] Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.389861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnbhb" event={"ID":"333bd9e9-4bac-49af-9d96-25c2c03cb96a","Type":"ContainerStarted","Data":"cd1fa5a046bcbc9bfdf4392f3c7c39657483898f416fa3e2af7a59cfcc775933"} Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.476414 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:31 crc kubenswrapper[4772]: W0930 17:24:31.483464 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5606ec35_7419_4109_ab7d_20cf4b1d4562.slice/crio-a18bc124fc90795151635d873570d88cb3fe4819f9156919aaee1ea0ccf13c4b WatchSource:0}: Error finding container a18bc124fc90795151635d873570d88cb3fe4819f9156919aaee1ea0ccf13c4b: Status 404 returned error can't find the container with id a18bc124fc90795151635d873570d88cb3fe4819f9156919aaee1ea0ccf13c4b Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.504761 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qklz"] Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.507026 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.517083 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.517239 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.526579 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qklz"] Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.604412 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:31 crc kubenswrapper[4772]: W0930 17:24:31.610457 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod208f2c4c_b208_4208_b931_71b68e7e7d39.slice/crio-f5a2d5b454bd2620c04a39a85a102ee8c2ba07a9032d5d296ea3270ca0deb10e WatchSource:0}: Error finding container f5a2d5b454bd2620c04a39a85a102ee8c2ba07a9032d5d296ea3270ca0deb10e: Status 404 returned error can't find the container with id f5a2d5b454bd2620c04a39a85a102ee8c2ba07a9032d5d296ea3270ca0deb10e Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.669009 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.669114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrgs\" (UniqueName: \"kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.669159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.669179 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.770869 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.770996 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrgs\" (UniqueName: 
\"kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.771075 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.771103 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.777944 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.777959 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.778980 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.789255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrgs\" (UniqueName: \"kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs\") pod \"nova-cell1-conductor-db-sync-7qklz\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.874148 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:24:31 crc kubenswrapper[4772]: I0930 17:24:31.893354 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.061418 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.077372 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:24:32 crc kubenswrapper[4772]: W0930 17:24:32.084011 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a742f77_0412_4089_85e8_f78cfef69aff.slice/crio-1f343d88ab35403509a3bff26547009b52ae2acb583f8963bd81ec69ade37890 WatchSource:0}: Error finding container 1f343d88ab35403509a3bff26547009b52ae2acb583f8963bd81ec69ade37890: Status 404 returned error can't find the container with id 1f343d88ab35403509a3bff26547009b52ae2acb583f8963bd81ec69ade37890 Sep 30 17:24:32 crc kubenswrapper[4772]: W0930 17:24:32.110582 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd69046b_895b_43d3_99a2_15d6f1edcfa4.slice/crio-e6daad2a1e15e2bb01be5b512446dc8229b6f9012da0427727b4a0ed5ca3a2f5 WatchSource:0}: Error finding container e6daad2a1e15e2bb01be5b512446dc8229b6f9012da0427727b4a0ed5ca3a2f5: Status 404 returned error can't find the container with id e6daad2a1e15e2bb01be5b512446dc8229b6f9012da0427727b4a0ed5ca3a2f5 Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.425168 4772 generic.go:334] "Generic (PLEG): container finished" podID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerID="5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd" exitCode=0 Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.425357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-957558b67-rfgbf" event={"ID":"9df50dc7-a9cd-4936-8f5b-c469a78679ca","Type":"ContainerDied","Data":"5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.425623 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-957558b67-rfgbf" event={"ID":"9df50dc7-a9cd-4936-8f5b-c469a78679ca","Type":"ContainerStarted","Data":"b0de630b3002320f367963a0ee5f8d2e33d1afb803b649c1b9af9a823adc1aea"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.429264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerStarted","Data":"a18bc124fc90795151635d873570d88cb3fe4819f9156919aaee1ea0ccf13c4b"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.438891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a742f77-0412-4089-85e8-f78cfef69aff","Type":"ContainerStarted","Data":"1f343d88ab35403509a3bff26547009b52ae2acb583f8963bd81ec69ade37890"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.463552 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnbhb" event={"ID":"333bd9e9-4bac-49af-9d96-25c2c03cb96a","Type":"ContainerStarted","Data":"130bb11040581823309c0a46d0fca21013d2a85f029bb5f10c13a83101632746"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.484850 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qklz"] Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.485164 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd69046b-895b-43d3-99a2-15d6f1edcfa4","Type":"ContainerStarted","Data":"e6daad2a1e15e2bb01be5b512446dc8229b6f9012da0427727b4a0ed5ca3a2f5"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.487557 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerStarted","Data":"f5a2d5b454bd2620c04a39a85a102ee8c2ba07a9032d5d296ea3270ca0deb10e"} Sep 30 17:24:32 crc kubenswrapper[4772]: I0930 17:24:32.501929 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lnbhb" podStartSLOduration=2.501911563 podStartE2EDuration="2.501911563s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:32.494853878 +0000 UTC m=+1373.401866719" watchObservedRunningTime="2025-09-30 17:24:32.501911563 +0000 UTC m=+1373.408924394" Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.508707 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qklz" event={"ID":"7f7d470d-5fe9-4d90-a24b-705f8af5d35d","Type":"ContainerStarted","Data":"6c5c0f407021dbbf16dbddb948491bd9cc0c44ef6d19cee05096b9e686ad98c6"} Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.509372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qklz" event={"ID":"7f7d470d-5fe9-4d90-a24b-705f8af5d35d","Type":"ContainerStarted","Data":"bc14500b96525c8fe3652206cfbff60566980213a8c5530de0f2689cb6ac968e"} Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.513731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-957558b67-rfgbf" event={"ID":"9df50dc7-a9cd-4936-8f5b-c469a78679ca","Type":"ContainerStarted","Data":"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b"} Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.514196 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.534397 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7qklz" podStartSLOduration=2.53435373 podStartE2EDuration="2.53435373s" podCreationTimestamp="2025-09-30 17:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:33.52524373 +0000 UTC m=+1374.432256571" watchObservedRunningTime="2025-09-30 17:24:33.53435373 +0000 UTC m=+1374.441366561" Sep 30 17:24:33 crc kubenswrapper[4772]: I0930 17:24:33.557943 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-957558b67-rfgbf" podStartSLOduration=3.557918158 podStartE2EDuration="3.557918158s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:33.547543486 +0000 UTC m=+1374.454556317" watchObservedRunningTime="2025-09-30 17:24:33.557918158 +0000 UTC m=+1374.464930999" Sep 30 17:24:34 crc kubenswrapper[4772]: I0930 17:24:34.381940 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:24:34 crc kubenswrapper[4772]: I0930 
17:24:34.404424 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.540928 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerStarted","Data":"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b"} Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.544629 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0a742f77-0412-4089-85e8-f78cfef69aff" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3" gracePeriod=30 Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.544913 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a742f77-0412-4089-85e8-f78cfef69aff","Type":"ContainerStarted","Data":"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3"} Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.549630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd69046b-895b-43d3-99a2-15d6f1edcfa4","Type":"ContainerStarted","Data":"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd"} Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.551579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerStarted","Data":"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30"} Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.566320 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.606456658 podStartE2EDuration="6.566300883s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="2025-09-30 17:24:32.09235658 +0000 UTC m=+1372.999369411" lastFinishedPulling="2025-09-30 17:24:36.052200805 +0000 UTC m=+1376.959213636" observedRunningTime="2025-09-30 17:24:36.561500407 +0000 UTC m=+1377.468513238" watchObservedRunningTime="2025-09-30 17:24:36.566300883 +0000 UTC m=+1377.473313714" Sep 30 17:24:36 crc kubenswrapper[4772]: I0930 17:24:36.600913 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.66406107 podStartE2EDuration="6.600892391s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="2025-09-30 17:24:32.110936948 +0000 UTC m=+1373.017949779" lastFinishedPulling="2025-09-30 17:24:36.047768249 +0000 UTC m=+1376.954781100" observedRunningTime="2025-09-30 17:24:36.592896521 +0000 UTC m=+1377.499909362" watchObservedRunningTime="2025-09-30 17:24:36.600892391 +0000 UTC m=+1377.507905222" Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.564758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerStarted","Data":"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b"} Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.564948 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-log" 
containerID="cri-o://da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" gracePeriod=30 Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.564973 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-metadata" containerID="cri-o://0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" gracePeriod=30 Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.578019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerStarted","Data":"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828"} Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.600260 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.167050098 podStartE2EDuration="7.600238819s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="2025-09-30 17:24:31.614121476 +0000 UTC m=+1372.521134307" lastFinishedPulling="2025-09-30 17:24:36.047310197 +0000 UTC m=+1376.954323028" observedRunningTime="2025-09-30 17:24:37.599318205 +0000 UTC m=+1378.506331036" watchObservedRunningTime="2025-09-30 17:24:37.600238819 +0000 UTC m=+1378.507251650" Sep 30 17:24:37 crc kubenswrapper[4772]: I0930 17:24:37.627646 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.0757118 podStartE2EDuration="7.627621708s" podCreationTimestamp="2025-09-30 17:24:30 +0000 UTC" firstStartedPulling="2025-09-30 17:24:31.495418849 +0000 UTC m=+1372.402431680" lastFinishedPulling="2025-09-30 17:24:36.047328757 +0000 UTC m=+1376.954341588" observedRunningTime="2025-09-30 17:24:37.622327909 +0000 UTC m=+1378.529340740" watchObservedRunningTime="2025-09-30 17:24:37.627621708 +0000 UTC m=+1378.534634539" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.149489 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.324486 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq2w4\" (UniqueName: \"kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4\") pod \"208f2c4c-b208-4208-b931-71b68e7e7d39\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.324975 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data\") pod \"208f2c4c-b208-4208-b931-71b68e7e7d39\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.325044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs\") pod \"208f2c4c-b208-4208-b931-71b68e7e7d39\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.325277 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle\") pod \"208f2c4c-b208-4208-b931-71b68e7e7d39\" (UID: \"208f2c4c-b208-4208-b931-71b68e7e7d39\") " Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.325859 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs" (OuterVolumeSpecName: "logs") pod "208f2c4c-b208-4208-b931-71b68e7e7d39" (UID: "208f2c4c-b208-4208-b931-71b68e7e7d39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.331848 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4" (OuterVolumeSpecName: "kube-api-access-rq2w4") pod "208f2c4c-b208-4208-b931-71b68e7e7d39" (UID: "208f2c4c-b208-4208-b931-71b68e7e7d39"). InnerVolumeSpecName "kube-api-access-rq2w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.356642 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "208f2c4c-b208-4208-b931-71b68e7e7d39" (UID: "208f2c4c-b208-4208-b931-71b68e7e7d39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.359438 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data" (OuterVolumeSpecName: "config-data") pod "208f2c4c-b208-4208-b931-71b68e7e7d39" (UID: "208f2c4c-b208-4208-b931-71b68e7e7d39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.427855 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq2w4\" (UniqueName: \"kubernetes.io/projected/208f2c4c-b208-4208-b931-71b68e7e7d39-kube-api-access-rq2w4\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.428170 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.428266 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208f2c4c-b208-4208-b931-71b68e7e7d39-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.428321 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208f2c4c-b208-4208-b931-71b68e7e7d39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.588933 4772 generic.go:334] "Generic (PLEG): container finished" podID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerID="0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" exitCode=0 Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.588967 4772 generic.go:334] "Generic (PLEG): container finished" podID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerID="da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" exitCode=143 Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.588999 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.589095 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerDied","Data":"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b"} Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.589124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerDied","Data":"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30"} Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.589155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208f2c4c-b208-4208-b931-71b68e7e7d39","Type":"ContainerDied","Data":"f5a2d5b454bd2620c04a39a85a102ee8c2ba07a9032d5d296ea3270ca0deb10e"} Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.589171 4772 scope.go:117] "RemoveContainer" containerID="0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.621452 4772 scope.go:117] "RemoveContainer" containerID="da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.630026 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.642161 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.645446 4772 scope.go:117] "RemoveContainer" containerID="0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" Sep 30 
17:24:38 crc kubenswrapper[4772]: E0930 17:24:38.645963 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b\": container with ID starting with 0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b not found: ID does not exist" containerID="0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646008 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b"} err="failed to get container status \"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b\": rpc error: code = NotFound desc = could not find container \"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b\": container with ID starting with 0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b not found: ID does not exist" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646038 4772 scope.go:117] "RemoveContainer" containerID="da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" Sep 30 17:24:38 crc kubenswrapper[4772]: E0930 17:24:38.646299 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30\": container with ID starting with da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30 not found: ID does not exist" containerID="da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646325 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30"} err="failed to get container status \"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30\": rpc error: code = NotFound desc = could not find container \"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30\": container with ID starting with da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30 not found: ID does not exist" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646340 4772 scope.go:117] "RemoveContainer" containerID="0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646627 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b"} err="failed to get container status \"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b\": rpc error: code = NotFound desc = could not find container \"0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b\": container with ID starting with 0404cabb081ea65a82a6f57b811da0790bbeb50641f35859db21ae05c281721b not found: ID does not exist" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646671 4772 scope.go:117] "RemoveContainer" containerID="da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.646894 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30"} err="failed to get container status 
\"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30\": rpc error: code = NotFound desc = could not find container \"da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30\": container with ID starting with da22de0e539ea23f664e622313d580b30458466c45b7513af48bcde09961ce30 not found: ID does not exist" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.663278 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:38 crc kubenswrapper[4772]: E0930 17:24:38.664107 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-log" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.664207 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-log" Sep 30 17:24:38 crc kubenswrapper[4772]: E0930 17:24:38.664264 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-metadata" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.664323 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-metadata" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.664574 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-metadata" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.664650 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" containerName="nova-metadata-log" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.666191 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.670449 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.712146 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.712400 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.712556 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.712589 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.739017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc295\" (UniqueName: \"kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.739191 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.739301 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.739326 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.739474 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.841179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " 
pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.842113 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.842205 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.842309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.842480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc295\" (UniqueName: \"kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.843645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.845383 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.846523 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.848930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:38 crc kubenswrapper[4772]: I0930 17:24:38.863888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc295\" (UniqueName: \"kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295\") pod \"nova-metadata-0\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " pod="openstack/nova-metadata-0" Sep 30 17:24:39 crc kubenswrapper[4772]: I0930 17:24:39.030147 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:39 crc kubenswrapper[4772]: I0930 17:24:39.468195 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:39 crc kubenswrapper[4772]: I0930 17:24:39.603862 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerStarted","Data":"1c191120b443cdbd7a30964614e767d1d7f5929616efcde7e798f0562f6b2eef"} Sep 30 17:24:39 crc kubenswrapper[4772]: I0930 17:24:39.928946 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208f2c4c-b208-4208-b931-71b68e7e7d39" path="/var/lib/kubelet/pods/208f2c4c-b208-4208-b931-71b68e7e7d39/volumes" Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.618872 4772 generic.go:334] "Generic (PLEG): container finished" podID="333bd9e9-4bac-49af-9d96-25c2c03cb96a" containerID="130bb11040581823309c0a46d0fca21013d2a85f029bb5f10c13a83101632746" exitCode=0 Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.619007 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnbhb" event={"ID":"333bd9e9-4bac-49af-9d96-25c2c03cb96a","Type":"ContainerDied","Data":"130bb11040581823309c0a46d0fca21013d2a85f029bb5f10c13a83101632746"} Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.622653 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerStarted","Data":"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a"} Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.622691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerStarted","Data":"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572"} Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.654215 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.654192268 podStartE2EDuration="2.654192268s" podCreationTimestamp="2025-09-30 17:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:40.653007507 +0000 UTC m=+1381.560020338" watchObservedRunningTime="2025-09-30 17:24:40.654192268 +0000 UTC m=+1381.561205099" Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.857620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:24:40 crc kubenswrapper[4772]: I0930 17:24:40.857693 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.113322 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.129365 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.166156 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.166402 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 17:24:41 crc kubenswrapper[4772]: 
I0930 17:24:41.186435 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.186717 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="dnsmasq-dns" containerID="cri-o://19a824243e9fc35e89d39cd88bfa75befda3372aa2ef243efd7a59b2eb133d9c" gracePeriod=10 Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.227917 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.646902 4772 generic.go:334] "Generic (PLEG): container finished" podID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerID="19a824243e9fc35e89d39cd88bfa75befda3372aa2ef243efd7a59b2eb133d9c" exitCode=0 Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.647146 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" event={"ID":"bf848a80-86ac-41c1-a85e-5cc4fb6e4192","Type":"ContainerDied","Data":"19a824243e9fc35e89d39cd88bfa75befda3372aa2ef243efd7a59b2eb133d9c"} Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.701039 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.845676 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.941314 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:41 crc kubenswrapper[4772]: I0930 17:24:41.941415 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.009840 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config\") pod \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.009976 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc\") pod \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.010070 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb\") pod \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.010106 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786rv\" (UniqueName: 
\"kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv\") pod \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.010188 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb\") pod \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\" (UID: \"bf848a80-86ac-41c1-a85e-5cc4fb6e4192\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.021957 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv" (OuterVolumeSpecName: "kube-api-access-786rv") pod "bf848a80-86ac-41c1-a85e-5cc4fb6e4192" (UID: "bf848a80-86ac-41c1-a85e-5cc4fb6e4192"). InnerVolumeSpecName "kube-api-access-786rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.068629 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config" (OuterVolumeSpecName: "config") pod "bf848a80-86ac-41c1-a85e-5cc4fb6e4192" (UID: "bf848a80-86ac-41c1-a85e-5cc4fb6e4192"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.074635 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf848a80-86ac-41c1-a85e-5cc4fb6e4192" (UID: "bf848a80-86ac-41c1-a85e-5cc4fb6e4192"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.104162 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf848a80-86ac-41c1-a85e-5cc4fb6e4192" (UID: "bf848a80-86ac-41c1-a85e-5cc4fb6e4192"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.107507 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf848a80-86ac-41c1-a85e-5cc4fb6e4192" (UID: "bf848a80-86ac-41c1-a85e-5cc4fb6e4192"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.112529 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.112564 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786rv\" (UniqueName: \"kubernetes.io/projected/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-kube-api-access-786rv\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.112573 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.112583 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.112591 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf848a80-86ac-41c1-a85e-5cc4fb6e4192-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.118222 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.213866 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle\") pod \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.213952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f46n4\" (UniqueName: \"kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4\") pod \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.214132 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts\") pod \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.214225 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data\") pod \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\" (UID: \"333bd9e9-4bac-49af-9d96-25c2c03cb96a\") " Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.226593 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts" (OuterVolumeSpecName: "scripts") pod "333bd9e9-4bac-49af-9d96-25c2c03cb96a" (UID: "333bd9e9-4bac-49af-9d96-25c2c03cb96a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.241605 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4" (OuterVolumeSpecName: "kube-api-access-f46n4") pod "333bd9e9-4bac-49af-9d96-25c2c03cb96a" (UID: "333bd9e9-4bac-49af-9d96-25c2c03cb96a"). InnerVolumeSpecName "kube-api-access-f46n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.248417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data" (OuterVolumeSpecName: "config-data") pod "333bd9e9-4bac-49af-9d96-25c2c03cb96a" (UID: "333bd9e9-4bac-49af-9d96-25c2c03cb96a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.249511 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "333bd9e9-4bac-49af-9d96-25c2c03cb96a" (UID: "333bd9e9-4bac-49af-9d96-25c2c03cb96a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.316858 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.316894 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.316906 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333bd9e9-4bac-49af-9d96-25c2c03cb96a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.316922 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f46n4\" (UniqueName: \"kubernetes.io/projected/333bd9e9-4bac-49af-9d96-25c2c03cb96a-kube-api-access-f46n4\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.664307 4772 generic.go:334] "Generic (PLEG): container finished" podID="7f7d470d-5fe9-4d90-a24b-705f8af5d35d" containerID="6c5c0f407021dbbf16dbddb948491bd9cc0c44ef6d19cee05096b9e686ad98c6" exitCode=0 Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.664340 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qklz" event={"ID":"7f7d470d-5fe9-4d90-a24b-705f8af5d35d","Type":"ContainerDied","Data":"6c5c0f407021dbbf16dbddb948491bd9cc0c44ef6d19cee05096b9e686ad98c6"} Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.668120 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnbhb" event={"ID":"333bd9e9-4bac-49af-9d96-25c2c03cb96a","Type":"ContainerDied","Data":"cd1fa5a046bcbc9bfdf4392f3c7c39657483898f416fa3e2af7a59cfcc775933"} Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.668174 4772 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cd1fa5a046bcbc9bfdf4392f3c7c39657483898f416fa3e2af7a59cfcc775933" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.668247 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnbhb" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.675745 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.679871 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb54587f-wpxpd" event={"ID":"bf848a80-86ac-41c1-a85e-5cc4fb6e4192","Type":"ContainerDied","Data":"2105055913c3aea0e4fb8ee22947cfdca46f50309efffb7be011974da2aa3993"} Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.679934 4772 scope.go:117] "RemoveContainer" containerID="19a824243e9fc35e89d39cd88bfa75befda3372aa2ef243efd7a59b2eb133d9c" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.718529 4772 scope.go:117] "RemoveContainer" containerID="73d7693961fa0263231f72a02cb4626cc42303b8b633abe6832e335b3e51a4e8" Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.736195 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.819870 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69bb54587f-wpxpd"] Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.834004 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.834251 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-log" containerID="cri-o://4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b" gracePeriod=30 Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.834729 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-api" containerID="cri-o://cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828" gracePeriod=30 Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.853477 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.853725 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-log" containerID="cri-o://d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" gracePeriod=30 Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.854416 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-metadata" containerID="cri-o://bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" gracePeriod=30 Sep 30 17:24:42 crc kubenswrapper[4772]: I0930 17:24:42.867558 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.429903 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.551614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data\") pod \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.551982 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc295\" (UniqueName: \"kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295\") pod \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.552152 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle\") pod \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.552248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs\") pod \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.552652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs\") pod \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\" (UID: \"c3efca0e-5eb1-40f9-9b3e-407176c576d8\") " Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.552574 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs" (OuterVolumeSpecName: "logs") pod "c3efca0e-5eb1-40f9-9b3e-407176c576d8" (UID: "c3efca0e-5eb1-40f9-9b3e-407176c576d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.553378 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3efca0e-5eb1-40f9-9b3e-407176c576d8-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.561991 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295" (OuterVolumeSpecName: "kube-api-access-cc295") pod "c3efca0e-5eb1-40f9-9b3e-407176c576d8" (UID: "c3efca0e-5eb1-40f9-9b3e-407176c576d8"). InnerVolumeSpecName "kube-api-access-cc295". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.587257 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3efca0e-5eb1-40f9-9b3e-407176c576d8" (UID: "c3efca0e-5eb1-40f9-9b3e-407176c576d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.590030 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data" (OuterVolumeSpecName: "config-data") pod "c3efca0e-5eb1-40f9-9b3e-407176c576d8" (UID: "c3efca0e-5eb1-40f9-9b3e-407176c576d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.630148 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c3efca0e-5eb1-40f9-9b3e-407176c576d8" (UID: "c3efca0e-5eb1-40f9-9b3e-407176c576d8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.654944 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.654983 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc295\" (UniqueName: \"kubernetes.io/projected/c3efca0e-5eb1-40f9-9b3e-407176c576d8-kube-api-access-cc295\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.654995 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.655007 4772 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3efca0e-5eb1-40f9-9b3e-407176c576d8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.685959 4772 generic.go:334] "Generic (PLEG): container finished" podID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerID="4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b" exitCode=143 Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.686046 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerDied","Data":"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b"} Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.689189 4772 generic.go:334] "Generic (PLEG): container finished" podID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerID="bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" exitCode=0 Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.689287 4772 generic.go:334] "Generic (PLEG): container finished" podID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerID="d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" exitCode=143 Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.690332 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.691275 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerDied","Data":"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a"} Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.691360 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerDied","Data":"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572"} Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.691375 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3efca0e-5eb1-40f9-9b3e-407176c576d8","Type":"ContainerDied","Data":"1c191120b443cdbd7a30964614e767d1d7f5929616efcde7e798f0562f6b2eef"} Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.691396 4772 scope.go:117] "RemoveContainer" containerID="bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.735275 4772 scope.go:117] "RemoveContainer" containerID="d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.738745 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.763632 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.775762 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.776763 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333bd9e9-4bac-49af-9d96-25c2c03cb96a" containerName="nova-manage" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.776822 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="333bd9e9-4bac-49af-9d96-25c2c03cb96a" containerName="nova-manage" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.776844 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-metadata" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.776868 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-metadata" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.776899 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="dnsmasq-dns" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.776906 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="dnsmasq-dns" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.776917 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="init" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.776922 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="init" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.776939 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-log" Sep 30 17:24:43 crc 
kubenswrapper[4772]: I0930 17:24:43.776945 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-log" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.777138 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="333bd9e9-4bac-49af-9d96-25c2c03cb96a" containerName="nova-manage" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.777157 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-log" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.777166 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" containerName="nova-metadata-metadata" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.777177 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" containerName="dnsmasq-dns" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.778662 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.782981 4772 scope.go:117] "RemoveContainer" containerID="bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.783259 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.783545 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.783752 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a\": container with ID starting with bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a not found: ID does not exist" containerID="bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.783858 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a"} err="failed to get container status \"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a\": rpc error: code = NotFound desc = could not find container \"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a\": container with ID starting with bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a not found: ID does not exist" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.783932 4772 scope.go:117] "RemoveContainer" containerID="d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" Sep 30 17:24:43 crc kubenswrapper[4772]: E0930 17:24:43.791104 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572\": container with ID starting with d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572 not found: ID does not exist" containerID="d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.791354 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572"} err="failed to get container status \"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572\": rpc error: code = NotFound desc = could not find container \"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572\": container with ID starting with d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572 not found: ID does not exist" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.791448 4772 scope.go:117] "RemoveContainer" containerID="bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.792263 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a"} err="failed to get container status \"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a\": rpc error: code = NotFound desc = could not find container \"bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a\": container with ID starting with bbb6ea7da18241b4dc7415c03036466aec2ff202f8526d6b41b63f300b2b911a not found: ID does not exist" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.792315 4772 scope.go:117] "RemoveContainer" containerID="d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.792908 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572"} err="failed to get container status \"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572\": rpc error: code = NotFound desc = could not find container \"d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572\": container with ID starting with d3f80e69561a11ad3e730ba55f5669ede42c3e9b24c45556000572b611699572 not found: ID does not exist" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.813629 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.859259 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl5q\" (UniqueName: \"kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.859336 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.859468 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.859488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.859513 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.912800 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf848a80-86ac-41c1-a85e-5cc4fb6e4192" path="/var/lib/kubelet/pods/bf848a80-86ac-41c1-a85e-5cc4fb6e4192/volumes" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.913614 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3efca0e-5eb1-40f9-9b3e-407176c576d8" path="/var/lib/kubelet/pods/c3efca0e-5eb1-40f9-9b3e-407176c576d8/volumes" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.961747 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl5q\" (UniqueName: \"kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.961849 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.962155 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.962175 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.962212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.962662 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.967562 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " 
pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.967928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.977818 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:43 crc kubenswrapper[4772]: I0930 17:24:43.980638 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl5q\" (UniqueName: \"kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q\") pod \"nova-metadata-0\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " pod="openstack/nova-metadata-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.060215 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.100243 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.165184 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts\") pod \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.165334 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwrgs\" (UniqueName: \"kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs\") pod \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.165450 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle\") pod \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.165710 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data\") pod \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\" (UID: \"7f7d470d-5fe9-4d90-a24b-705f8af5d35d\") " Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.168582 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts" (OuterVolumeSpecName: "scripts") pod "7f7d470d-5fe9-4d90-a24b-705f8af5d35d" (UID: "7f7d470d-5fe9-4d90-a24b-705f8af5d35d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.169290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs" (OuterVolumeSpecName: "kube-api-access-gwrgs") pod "7f7d470d-5fe9-4d90-a24b-705f8af5d35d" (UID: "7f7d470d-5fe9-4d90-a24b-705f8af5d35d"). InnerVolumeSpecName "kube-api-access-gwrgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.197177 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data" (OuterVolumeSpecName: "config-data") pod "7f7d470d-5fe9-4d90-a24b-705f8af5d35d" (UID: "7f7d470d-5fe9-4d90-a24b-705f8af5d35d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.198355 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f7d470d-5fe9-4d90-a24b-705f8af5d35d" (UID: "7f7d470d-5fe9-4d90-a24b-705f8af5d35d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.269739 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwrgs\" (UniqueName: \"kubernetes.io/projected/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-kube-api-access-gwrgs\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.269787 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.269803 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.269816 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7d470d-5fe9-4d90-a24b-705f8af5d35d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.558514 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.706048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerStarted","Data":"3408e19e00dd6471ee533f0732d295738b8ebb8ac4ebfe8b48db8c77b18be461"} Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.723178 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qklz" event={"ID":"7f7d470d-5fe9-4d90-a24b-705f8af5d35d","Type":"ContainerDied","Data":"bc14500b96525c8fe3652206cfbff60566980213a8c5530de0f2689cb6ac968e"} Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.723253 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc14500b96525c8fe3652206cfbff60566980213a8c5530de0f2689cb6ac968e" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.723343 4772 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qklz" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.727808 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerName="nova-scheduler-scheduler" containerID="cri-o://befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" gracePeriod=30 Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.775916 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 17:24:44 crc kubenswrapper[4772]: E0930 17:24:44.776726 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7d470d-5fe9-4d90-a24b-705f8af5d35d" containerName="nova-cell1-conductor-db-sync" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.776754 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7d470d-5fe9-4d90-a24b-705f8af5d35d" containerName="nova-cell1-conductor-db-sync" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.777012 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7d470d-5fe9-4d90-a24b-705f8af5d35d" containerName="nova-cell1-conductor-db-sync" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.778001 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.781970 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.785975 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.882004 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.882250 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.882472 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmc94\" (UniqueName: \"kubernetes.io/projected/be6b14d4-5d20-4cae-add5-4702dc26ecc5-kube-api-access-qmc94\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.983918 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.984020 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.984218 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmc94\" (UniqueName: \"kubernetes.io/projected/be6b14d4-5d20-4cae-add5-4702dc26ecc5-kube-api-access-qmc94\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.991069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:44 crc kubenswrapper[4772]: I0930 17:24:44.991215 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6b14d4-5d20-4cae-add5-4702dc26ecc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.002152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmc94\" (UniqueName: \"kubernetes.io/projected/be6b14d4-5d20-4cae-add5-4702dc26ecc5-kube-api-access-qmc94\") pod \"nova-cell1-conductor-0\" (UID: \"be6b14d4-5d20-4cae-add5-4702dc26ecc5\") " pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.193122 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.684863 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.743747 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerStarted","Data":"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d"} Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.743794 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerStarted","Data":"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4"} Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.746374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"be6b14d4-5d20-4cae-add5-4702dc26ecc5","Type":"ContainerStarted","Data":"669f37876b2c62aee5871205f3ea7d382fd82a2ad6e3dce428f59871181ff3ee"} Sep 30 17:24:45 crc kubenswrapper[4772]: I0930 17:24:45.769306 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.769287443 podStartE2EDuration="2.769287443s" podCreationTimestamp="2025-09-30 17:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:45.767774453 +0000 UTC m=+1386.674787284" watchObservedRunningTime="2025-09-30 17:24:45.769287443 +0000 UTC m=+1386.676300274" Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.180750 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.185472 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.187853 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.187955 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerName="nova-scheduler-scheduler" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.457629 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.516519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle\") pod \"5606ec35-7419-4109-ab7d-20cf4b1d4562\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.516666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data\") pod \"5606ec35-7419-4109-ab7d-20cf4b1d4562\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.516730 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctnb\" (UniqueName: \"kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb\") pod \"5606ec35-7419-4109-ab7d-20cf4b1d4562\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.516895 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs\") pod \"5606ec35-7419-4109-ab7d-20cf4b1d4562\" (UID: \"5606ec35-7419-4109-ab7d-20cf4b1d4562\") " Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.520004 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs" (OuterVolumeSpecName: "logs") pod "5606ec35-7419-4109-ab7d-20cf4b1d4562" (UID: "5606ec35-7419-4109-ab7d-20cf4b1d4562"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.524407 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb" (OuterVolumeSpecName: "kube-api-access-jctnb") pod "5606ec35-7419-4109-ab7d-20cf4b1d4562" (UID: "5606ec35-7419-4109-ab7d-20cf4b1d4562"). InnerVolumeSpecName "kube-api-access-jctnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.549571 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data" (OuterVolumeSpecName: "config-data") pod "5606ec35-7419-4109-ab7d-20cf4b1d4562" (UID: "5606ec35-7419-4109-ab7d-20cf4b1d4562"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.557357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5606ec35-7419-4109-ab7d-20cf4b1d4562" (UID: "5606ec35-7419-4109-ab7d-20cf4b1d4562"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.619566 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.619606 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5606ec35-7419-4109-ab7d-20cf4b1d4562-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.619619 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctnb\" (UniqueName: \"kubernetes.io/projected/5606ec35-7419-4109-ab7d-20cf4b1d4562-kube-api-access-jctnb\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.619630 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5606ec35-7419-4109-ab7d-20cf4b1d4562-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.785827 4772 generic.go:334] "Generic (PLEG): container finished" podID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerID="cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828" exitCode=0 Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.785894 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerDied","Data":"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828"} Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.785924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5606ec35-7419-4109-ab7d-20cf4b1d4562","Type":"ContainerDied","Data":"a18bc124fc90795151635d873570d88cb3fe4819f9156919aaee1ea0ccf13c4b"} Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.785944 4772 scope.go:117] "RemoveContainer" containerID="cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.786452 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.792224 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"be6b14d4-5d20-4cae-add5-4702dc26ecc5","Type":"ContainerStarted","Data":"f70ab6664db60aa20e8e60f8de24013b5470c53274e473f7cd60ed18ca8bf971"} Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.792345 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.824045 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.824024733 podStartE2EDuration="2.824024733s" podCreationTimestamp="2025-09-30 17:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:46.816855075 +0000 UTC m=+1387.723867906" watchObservedRunningTime="2025-09-30 17:24:46.824024733 +0000 UTC m=+1387.731037564" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.843598 4772 scope.go:117] "RemoveContainer" containerID="4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.866377 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.893874 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.897499 4772 scope.go:117] "RemoveContainer" containerID="cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828" Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.919258 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828\": container with ID starting with cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828 not found: ID does not exist" containerID="cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.919320 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828"} err="failed to get container status \"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828\": rpc error: code = NotFound desc = could not find container \"cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828\": container with ID starting with cbedb521c82c7373241983d465dbe163bc3651def38fa236cfd2b1c14641a828 not found: ID does not exist" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.919351 4772 scope.go:117] "RemoveContainer" containerID="4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.940552 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.941442 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-api" Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.941466 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-api" Sep 30 17:24:46 crc 
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.941518 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-log"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.941750 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-api"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.941771 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" containerName="nova-api-log"
Sep 30 17:24:46 crc kubenswrapper[4772]: E0930 17:24:46.941999 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b\": container with ID starting with 4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b not found: ID does not exist" containerID="4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.942050 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b"} err="failed to get container status \"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b\": rpc error: code = NotFound desc = could not find container \"4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b\": container with ID starting with 4dcce53b4fb1fab98c3b823a4abff13377d53f3a82580820ccbd0364539cdc5b not found: ID does not exist"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.943975 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.948931 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 17:24:46 crc kubenswrapper[4772]: I0930 17:24:46.973330 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.027162 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.027231 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.027290 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfsdb\" (UniqueName: \"kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.027329 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.129325 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfsdb\" (UniqueName: \"kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.129417 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.129517 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.129578 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.130113 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.134936 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.144920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.146289 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfsdb\" (UniqueName: \"kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb\") pod \"nova-api-0\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.288887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.741203 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.804687 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerStarted","Data":"a819bd5f606db1d0fe616d121ffd67ab2f434d46b8f40de46339f1007406a228"}
Sep 30 17:24:47 crc kubenswrapper[4772]: I0930 17:24:47.912874 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5606ec35-7419-4109-ab7d-20cf4b1d4562" path="/var/lib/kubelet/pods/5606ec35-7419-4109-ab7d-20cf4b1d4562/volumes"
Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.624275 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.661994 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7jtw\" (UniqueName: \"kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw\") pod \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") "
Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.662312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data\") pod \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") "
Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.662418 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle\") pod \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\" (UID: \"bd69046b-895b-43d3-99a2-15d6f1edcfa4\") "
Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.704136 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd69046b-895b-43d3-99a2-15d6f1edcfa4" (UID: "bd69046b-895b-43d3-99a2-15d6f1edcfa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.707463 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw" (OuterVolumeSpecName: "kube-api-access-z7jtw") pod "bd69046b-895b-43d3-99a2-15d6f1edcfa4" (UID: "bd69046b-895b-43d3-99a2-15d6f1edcfa4"). InnerVolumeSpecName "kube-api-access-z7jtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.716742 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data" (OuterVolumeSpecName: "config-data") pod "bd69046b-895b-43d3-99a2-15d6f1edcfa4" (UID: "bd69046b-895b-43d3-99a2-15d6f1edcfa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.765779 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.766102 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7jtw\" (UniqueName: \"kubernetes.io/projected/bd69046b-895b-43d3-99a2-15d6f1edcfa4-kube-api-access-z7jtw\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.766194 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd69046b-895b-43d3-99a2-15d6f1edcfa4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.815070 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerStarted","Data":"8c098d680cb2ebcb3d9ac3c370be9cf70ad97edd28d706c566855c051668f424"} Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.815127 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerStarted","Data":"af785359dfac7b973f17fc3b9c0bbafe6b6f06f284c6021695588752450349af"} Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.816430 4772 generic.go:334] "Generic (PLEG): container finished" podID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" exitCode=0 Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.816691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd69046b-895b-43d3-99a2-15d6f1edcfa4","Type":"ContainerDied","Data":"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd"} Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.816725 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd69046b-895b-43d3-99a2-15d6f1edcfa4","Type":"ContainerDied","Data":"e6daad2a1e15e2bb01be5b512446dc8229b6f9012da0427727b4a0ed5ca3a2f5"} Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.816747 4772 scope.go:117] "RemoveContainer" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.816881 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.851314 4772 scope.go:117] "RemoveContainer" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" Sep 30 17:24:48 crc kubenswrapper[4772]: E0930 17:24:48.852937 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd\": container with ID starting with befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd not found: ID does not exist" containerID="befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.853074 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd"} err="failed to get container status \"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd\": rpc error: code = NotFound desc = could not find container \"befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd\": container with ID starting with befc77fe37cf06ef14dd5ae90833d9326e230f4e0758ee657625576d00b229bd not found: ID does not exist" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.858903 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.858881028 podStartE2EDuration="2.858881028s" podCreationTimestamp="2025-09-30 17:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:48.851579976 +0000 UTC m=+1389.758592797" watchObservedRunningTime="2025-09-30 17:24:48.858881028 +0000 UTC m=+1389.765893859" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.877232 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.890762 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.901331 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:48 crc kubenswrapper[4772]: E0930 17:24:48.901790 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerName="nova-scheduler-scheduler" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.901812 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerName="nova-scheduler-scheduler" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.902091 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" containerName="nova-scheduler-scheduler" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.902740 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.905259 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.915915 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.971445 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.971532 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf94h\" (UniqueName: \"kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:48 crc kubenswrapper[4772]: I0930 17:24:48.971611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.073617 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.073711 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf94h\" (UniqueName: \"kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.073766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.077873 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.077941 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.093251 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf94h\" (UniqueName: 
\"kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h\") pod \"nova-scheduler-0\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.100561 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.100873 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.225204 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.670661 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.830870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"221815bf-6b6e-4241-8dc2-6591acff3e68","Type":"ContainerStarted","Data":"69318a05ebf5d504e25f212c9c750c38ba024c0c8a532f4731eb45c5e6ca6712"} Sep 30 17:24:49 crc kubenswrapper[4772]: I0930 17:24:49.910036 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd69046b-895b-43d3-99a2-15d6f1edcfa4" path="/var/lib/kubelet/pods/bd69046b-895b-43d3-99a2-15d6f1edcfa4/volumes" Sep 30 17:24:50 crc kubenswrapper[4772]: I0930 17:24:50.222306 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 17:24:50 crc kubenswrapper[4772]: I0930 17:24:50.840536 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"221815bf-6b6e-4241-8dc2-6591acff3e68","Type":"ContainerStarted","Data":"0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc"} Sep 30 17:24:50 crc kubenswrapper[4772]: I0930 17:24:50.864011 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8639450589999997 podStartE2EDuration="2.863945059s" podCreationTimestamp="2025-09-30 17:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:24:50.858529267 +0000 UTC m=+1391.765542148" watchObservedRunningTime="2025-09-30 17:24:50.863945059 +0000 UTC m=+1391.770957900" Sep 30 17:24:54 crc kubenswrapper[4772]: I0930 17:24:54.101161 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:24:54 crc kubenswrapper[4772]: I0930 17:24:54.101473 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:24:54 crc kubenswrapper[4772]: I0930 17:24:54.225523 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:24:55 crc kubenswrapper[4772]: I0930 17:24:55.112263 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:55 crc kubenswrapper[4772]: I0930 17:24:55.112270 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:57 crc kubenswrapper[4772]: I0930 17:24:57.290217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:24:57 crc kubenswrapper[4772]: I0930 17:24:57.290826 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:24:58 crc kubenswrapper[4772]: I0930 17:24:58.372344 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:58 crc kubenswrapper[4772]: I0930 17:24:58.372484 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:24:59 crc kubenswrapper[4772]: I0930 17:24:59.226233 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 17:24:59 crc kubenswrapper[4772]: I0930 17:24:59.259623 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 17:24:59 crc kubenswrapper[4772]: I0930 17:24:59.976795 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 17:25:04 crc kubenswrapper[4772]: I0930 17:25:04.106495 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 17:25:04 crc kubenswrapper[4772]: I0930 17:25:04.109287 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 17:25:04 crc kubenswrapper[4772]: I0930 17:25:04.116729 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 17:25:05 crc kubenswrapper[4772]: I0930 17:25:05.008349 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 17:25:06 crc kubenswrapper[4772]: I0930 17:25:06.948700 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.022620 4772 generic.go:334] "Generic (PLEG): container finished" podID="0a742f77-0412-4089-85e8-f78cfef69aff" containerID="56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3" exitCode=137 Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.022687 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a742f77-0412-4089-85e8-f78cfef69aff","Type":"ContainerDied","Data":"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3"} Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.022745 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a742f77-0412-4089-85e8-f78cfef69aff","Type":"ContainerDied","Data":"1f343d88ab35403509a3bff26547009b52ae2acb583f8963bd81ec69ade37890"} Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.022763 4772 scope.go:117] "RemoveContainer" containerID="56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.022817 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.044394 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data\") pod \"0a742f77-0412-4089-85e8-f78cfef69aff\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.044514 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle\") pod \"0a742f77-0412-4089-85e8-f78cfef69aff\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.044607 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n86q\" (UniqueName: \"kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q\") pod \"0a742f77-0412-4089-85e8-f78cfef69aff\" (UID: \"0a742f77-0412-4089-85e8-f78cfef69aff\") " Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.051142 4772 scope.go:117] "RemoveContainer" containerID="56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3" Sep 30 17:25:07 crc kubenswrapper[4772]: E0930 17:25:07.051867 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3\": container with ID starting with 56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3 not found: ID does not exist" containerID="56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.051945 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3"} err="failed to get container status \"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3\": rpc error: code = NotFound desc = could not find container \"56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3\": container with ID starting with 
56139c1106dd50b8e9f0075a0213bb7cf71b0f5fca3c88ab3a1f3b32fb1413c3 not found: ID does not exist" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.055645 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q" (OuterVolumeSpecName: "kube-api-access-8n86q") pod "0a742f77-0412-4089-85e8-f78cfef69aff" (UID: "0a742f77-0412-4089-85e8-f78cfef69aff"). InnerVolumeSpecName "kube-api-access-8n86q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.084552 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a742f77-0412-4089-85e8-f78cfef69aff" (UID: "0a742f77-0412-4089-85e8-f78cfef69aff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.084964 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data" (OuterVolumeSpecName: "config-data") pod "0a742f77-0412-4089-85e8-f78cfef69aff" (UID: "0a742f77-0412-4089-85e8-f78cfef69aff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.147237 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.147592 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a742f77-0412-4089-85e8-f78cfef69aff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.147680 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n86q\" (UniqueName: \"kubernetes.io/projected/0a742f77-0412-4089-85e8-f78cfef69aff-kube-api-access-8n86q\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.295591 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.295978 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.296143 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.302099 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.358517 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.373834 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.388606 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:25:07 crc kubenswrapper[4772]: E0930 17:25:07.389158 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a742f77-0412-4089-85e8-f78cfef69aff" 
containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.389177 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a742f77-0412-4089-85e8-f78cfef69aff" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.389448 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a742f77-0412-4089-85e8-f78cfef69aff" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.390231 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.393071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.393305 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.393425 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.397633 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.452667 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.455199 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.455290 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9d5\" (UniqueName: \"kubernetes.io/projected/21aeeee6-b52d-4cd0-b635-085708b6e9d9-kube-api-access-mh9d5\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.455409 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.455478 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.557723 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.557804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.557905 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.557940 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.557991 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9d5\" (UniqueName: \"kubernetes.io/projected/21aeeee6-b52d-4cd0-b635-085708b6e9d9-kube-api-access-mh9d5\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.562500 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.563166 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.563292 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.563354 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21aeeee6-b52d-4cd0-b635-085708b6e9d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.574781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9d5\" (UniqueName: \"kubernetes.io/projected/21aeeee6-b52d-4cd0-b635-085708b6e9d9-kube-api-access-mh9d5\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"21aeeee6-b52d-4cd0-b635-085708b6e9d9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.757111 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:07 crc kubenswrapper[4772]: I0930 17:25:07.907856 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a742f77-0412-4089-85e8-f78cfef69aff" path="/var/lib/kubelet/pods/0a742f77-0412-4089-85e8-f78cfef69aff/volumes" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.034104 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.039847 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.210570 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"] Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.212640 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.252112 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"] Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.309018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.384883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdpn\" (UniqueName: \"kubernetes.io/projected/b5d55792-e918-48b2-ab18-344dbd67c4b7-kube-api-access-vwdpn\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.384952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.385029 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.385101 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.385129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:08 crc 
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.487581 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.487785 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.487921 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.487947 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.488422 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.488570 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.488986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.492770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.509867 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdpn\" (UniqueName: \"kubernetes.io/projected/b5d55792-e918-48b2-ab18-344dbd67c4b7-kube-api-access-vwdpn\") pod \"dnsmasq-dns-fd69b5dbc-nw55h\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") " pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.556842 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.655283 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.655466 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.655606 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.657228 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.657297 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68" gracePeriod=600
Sep 30 17:25:08 crc kubenswrapper[4772]: I0930 17:25:08.876672 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"]
Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.043432 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21aeeee6-b52d-4cd0-b635-085708b6e9d9","Type":"ContainerStarted","Data":"586ca2d08b8a89e21315d1eacf65053547fff18c88402f5c077160f25dc03bfa"}
Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.043486 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21aeeee6-b52d-4cd0-b635-085708b6e9d9","Type":"ContainerStarted","Data":"f012063ca6f5ce09d84a50e249bd0ce4095ef2d0d9d1ed76489e0c6375107d56"}
Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.048980 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68" exitCode=0
Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.049076 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68"}
event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68"} Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.049158 4772 scope.go:117] "RemoveContainer" containerID="efa0334e5be43d3bffa768f2acb0e43691dcf91743c608a3a66ab0007419afd9" Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.050551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" event={"ID":"b5d55792-e918-48b2-ab18-344dbd67c4b7","Type":"ContainerStarted","Data":"1e372803af5f129e441cc89756d92a9add73f57d97c83ee72c9e14beee7d3c91"} Sep 30 17:25:09 crc kubenswrapper[4772]: I0930 17:25:09.929694 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.929671428 podStartE2EDuration="2.929671428s" podCreationTimestamp="2025-09-30 17:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:09.067716348 +0000 UTC m=+1409.974729199" watchObservedRunningTime="2025-09-30 17:25:09.929671428 +0000 UTC m=+1410.836684259" Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.061632 4772 generic.go:334] "Generic (PLEG): container finished" podID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerID="9260a303bd7cbea98ec5d7d41b6047de570e4fc2efe778b0b02b3ea270899c01" exitCode=0 Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.061720 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" event={"ID":"b5d55792-e918-48b2-ab18-344dbd67c4b7","Type":"ContainerDied","Data":"9260a303bd7cbea98ec5d7d41b6047de570e4fc2efe778b0b02b3ea270899c01"} Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.065900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"} Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.607538 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.608541 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="proxy-httpd" containerID="cri-o://b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0" gracePeriod=30 Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.608557 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="sg-core" containerID="cri-o://0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb" gracePeriod=30 Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.608540 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-notification-agent" containerID="cri-o://4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5" gracePeriod=30 Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.608757 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" 
containerName="ceilometer-central-agent" containerID="cri-o://771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee" gracePeriod=30 Sep 30 17:25:10 crc kubenswrapper[4772]: I0930 17:25:10.957567 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.077436 4772 generic.go:334] "Generic (PLEG): container finished" podID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerID="b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0" exitCode=0 Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.077474 4772 generic.go:334] "Generic (PLEG): container finished" podID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerID="0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb" exitCode=2 Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.077508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerDied","Data":"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0"} Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.077537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerDied","Data":"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb"} Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.080577 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-log" containerID="cri-o://af785359dfac7b973f17fc3b9c0bbafe6b6f06f284c6021695588752450349af" gracePeriod=30 Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.081188 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-api" containerID="cri-o://8c098d680cb2ebcb3d9ac3c370be9cf70ad97edd28d706c566855c051668f424" gracePeriod=30 Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.082102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" event={"ID":"b5d55792-e918-48b2-ab18-344dbd67c4b7","Type":"ContainerStarted","Data":"0ad2b440d72c2f98720dc865977acb06a6446d68be469b949818dfc50c4cb4eb"} Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.082137 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" Sep 30 17:25:11 crc kubenswrapper[4772]: I0930 17:25:11.102800 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" podStartSLOduration=3.102778108 podStartE2EDuration="3.102778108s" podCreationTimestamp="2025-09-30 17:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:11.102267974 +0000 UTC m=+1412.009280795" watchObservedRunningTime="2025-09-30 17:25:11.102778108 +0000 UTC m=+1412.009790949" Sep 30 17:25:12 crc kubenswrapper[4772]: I0930 17:25:12.103685 4772 generic.go:334] "Generic (PLEG): container finished" podID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerID="771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee" exitCode=0 Sep 30 17:25:12 crc kubenswrapper[4772]: I0930 17:25:12.104222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerDied","Data":"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee"} Sep 30 17:25:12 crc kubenswrapper[4772]: I0930 17:25:12.106301 4772 generic.go:334] "Generic (PLEG): container finished" podID="75594866-d94b-4605-a502-77fe533e0407" containerID="af785359dfac7b973f17fc3b9c0bbafe6b6f06f284c6021695588752450349af" exitCode=143 Sep 30 17:25:12 crc kubenswrapper[4772]: I0930 17:25:12.106446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerDied","Data":"af785359dfac7b973f17fc3b9c0bbafe6b6f06f284c6021695588752450349af"} Sep 30 17:25:12 crc kubenswrapper[4772]: I0930 17:25:12.758293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.119771 4772 generic.go:334] "Generic (PLEG): container finished" podID="75594866-d94b-4605-a502-77fe533e0407" containerID="8c098d680cb2ebcb3d9ac3c370be9cf70ad97edd28d706c566855c051668f424" exitCode=0 Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.119875 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerDied","Data":"8c098d680cb2ebcb3d9ac3c370be9cf70ad97edd28d706c566855c051668f424"} Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.314149 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.516466 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data\") pod \"75594866-d94b-4605-a502-77fe533e0407\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.516613 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle\") pod \"75594866-d94b-4605-a502-77fe533e0407\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.516659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs\") pod \"75594866-d94b-4605-a502-77fe533e0407\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.516702 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfsdb\" (UniqueName: \"kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb\") pod \"75594866-d94b-4605-a502-77fe533e0407\" (UID: \"75594866-d94b-4605-a502-77fe533e0407\") " Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.517370 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs" (OuterVolumeSpecName: "logs") pod "75594866-d94b-4605-a502-77fe533e0407" (UID: "75594866-d94b-4605-a502-77fe533e0407"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.525973 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb" (OuterVolumeSpecName: "kube-api-access-gfsdb") pod "75594866-d94b-4605-a502-77fe533e0407" (UID: "75594866-d94b-4605-a502-77fe533e0407"). InnerVolumeSpecName "kube-api-access-gfsdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.550288 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data" (OuterVolumeSpecName: "config-data") pod "75594866-d94b-4605-a502-77fe533e0407" (UID: "75594866-d94b-4605-a502-77fe533e0407"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.553953 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75594866-d94b-4605-a502-77fe533e0407" (UID: "75594866-d94b-4605-a502-77fe533e0407"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.619050 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.619430 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75594866-d94b-4605-a502-77fe533e0407-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.619444 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfsdb\" (UniqueName: \"kubernetes.io/projected/75594866-d94b-4605-a502-77fe533e0407-kube-api-access-gfsdb\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:13 crc kubenswrapper[4772]: I0930 17:25:13.619462 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75594866-d94b-4605-a502-77fe533e0407-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.140193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75594866-d94b-4605-a502-77fe533e0407","Type":"ContainerDied","Data":"a819bd5f606db1d0fe616d121ffd67ab2f434d46b8f40de46339f1007406a228"} Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.140274 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.140537 4772 scope.go:117] "RemoveContainer" containerID="8c098d680cb2ebcb3d9ac3c370be9cf70ad97edd28d706c566855c051668f424" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.167635 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.180428 4772 scope.go:117] "RemoveContainer" containerID="af785359dfac7b973f17fc3b9c0bbafe6b6f06f284c6021695588752450349af" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.196589 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.226162 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:14 crc kubenswrapper[4772]: E0930 17:25:14.226586 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-log" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.226600 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-log" Sep 30 17:25:14 crc kubenswrapper[4772]: E0930 17:25:14.226609 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-api" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.226617 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-api" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.226814 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-api" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.226835 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="75594866-d94b-4605-a502-77fe533e0407" containerName="nova-api-log" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.227880 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.230339 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.234459 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.235418 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.236854 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.333619 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.333673 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwq4d\" (UniqueName: \"kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.333740 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.333929 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.334033 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.334108 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.435806 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.435860 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwq4d\" (UniqueName: \"kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d\") pod 
\"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.435911 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.435933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.435985 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.436017 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.437469 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.450633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.450774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.450841 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.450959 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.454594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwq4d\" (UniqueName: \"kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d\") pod \"nova-api-0\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " 
pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.546236 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:14 crc kubenswrapper[4772]: I0930 17:25:14.780120 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.182:3000/\": dial tcp 10.217.0.182:3000: connect: connection refused" Sep 30 17:25:15 crc kubenswrapper[4772]: I0930 17:25:15.003178 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:15 crc kubenswrapper[4772]: I0930 17:25:15.153547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerStarted","Data":"39fd31de2a95417ead5f44888f065f942086093415a64484d23553fab6190bdb"} Sep 30 17:25:15 crc kubenswrapper[4772]: I0930 17:25:15.910178 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75594866-d94b-4605-a502-77fe533e0407" path="/var/lib/kubelet/pods/75594866-d94b-4605-a502-77fe533e0407/volumes" Sep 30 17:25:15 crc kubenswrapper[4772]: I0930 17:25:15.972834 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170239 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9df8v\" (UniqueName: \"kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170326 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170419 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170439 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170596 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170622 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc 
kubenswrapper[4772]: I0930 17:25:16.170644 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.170672 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd\") pod \"f43550b7-9e60-4018-99bb-c1ef5c05b022\" (UID: \"f43550b7-9e60-4018-99bb-c1ef5c05b022\") " Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.171546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.172278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.175852 4772 generic.go:334] "Generic (PLEG): container finished" podID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerID="4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5" exitCode=0 Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.175903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerDied","Data":"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5"} Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.175959 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43550b7-9e60-4018-99bb-c1ef5c05b022","Type":"ContainerDied","Data":"776a8745bcd7081f1e83fa71734cc8d1d2d908537ac54ab250c45ae4641a877c"} Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.175964 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.175982 4772 scope.go:117] "RemoveContainer" containerID="b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.177858 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v" (OuterVolumeSpecName: "kube-api-access-9df8v") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "kube-api-access-9df8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.178938 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts" (OuterVolumeSpecName: "scripts") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.179964 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerStarted","Data":"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7"} Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.180005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerStarted","Data":"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291"} Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.205484 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.205456035 podStartE2EDuration="2.205456035s" podCreationTimestamp="2025-09-30 17:25:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:16.200960157 +0000 UTC m=+1417.107972998" watchObservedRunningTime="2025-09-30 17:25:16.205456035 +0000 UTC m=+1417.112468866" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.208353 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.240725 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.269403 4772 scope.go:117] "RemoveContainer" containerID="0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273651 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273683 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273694 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273706 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273715 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9df8v\" (UniqueName: \"kubernetes.io/projected/f43550b7-9e60-4018-99bb-c1ef5c05b022-kube-api-access-9df8v\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.273724 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43550b7-9e60-4018-99bb-c1ef5c05b022-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.278603 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.291302 4772 scope.go:117] "RemoveContainer" containerID="4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.297806 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data" (OuterVolumeSpecName: "config-data") pod "f43550b7-9e60-4018-99bb-c1ef5c05b022" (UID: "f43550b7-9e60-4018-99bb-c1ef5c05b022"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.313543 4772 scope.go:117] "RemoveContainer" containerID="771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.341278 4772 scope.go:117] "RemoveContainer" containerID="b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.341914 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0\": container with ID starting with b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0 not found: ID does not exist" containerID="b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.341947 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0"} err="failed to get container status \"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0\": rpc error: code = NotFound desc = could not find container \"b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0\": container with ID starting with b6a3356f8b9310300fbd69e94812630511bc3113966ce4c105827bf076be97b0 not found: ID does not exist" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.341970 4772 scope.go:117] "RemoveContainer" containerID="0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.342285 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb\": container with ID starting with 0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb not found: ID does not exist" containerID="0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.342308 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb"} err="failed to get container status \"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb\": rpc error: code = NotFound desc = could not find container \"0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb\": container with ID starting with 0f5793d865897165a7f1258c7fa4b739400bff834f1b761535a92d0e6dc8eafb not found: ID does not exist" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.342321 4772 scope.go:117] "RemoveContainer" containerID="4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.342501 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5\": container with ID starting with 4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5 not found: ID does not exist" containerID="4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.342515 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5"} err="failed to get container status \"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5\": rpc error: code = NotFound desc = could not find container \"4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5\": container with ID starting with 4da68bb7805381b9777c29d87325dcfee7990e2b232e06ddf320c47685a9b0b5 not found: ID does not exist" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.342526 4772 scope.go:117] "RemoveContainer" containerID="771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.342666 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee\": container with ID starting with 771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee not found: ID does not exist" containerID="771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.342680 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee"} err="failed to get container status \"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee\": rpc error: code = NotFound desc = could not find container \"771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee\": container with ID starting with 771daac41d3977e1c1e981162d0e4ddec97688d1aabf8a3436745bb9ca47f0ee not found: ID does not exist" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.375363 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.375410 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43550b7-9e60-4018-99bb-c1ef5c05b022-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.534957 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.544907 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.560478 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.560943 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-notification-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.560964 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-notification-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.560980 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-central-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.560987 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-central-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: 
E0930 17:25:16.561021 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="sg-core" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561027 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="sg-core" Sep 30 17:25:16 crc kubenswrapper[4772]: E0930 17:25:16.561041 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="proxy-httpd" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561047 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="proxy-httpd" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561222 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-central-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561244 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="ceilometer-notification-agent" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561259 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="proxy-httpd" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.561268 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" containerName="sg-core" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.562933 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.568124 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.601005 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.601160 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.601274 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604551 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6vh\" (UniqueName: \"kubernetes.io/projected/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-kube-api-access-lk6vh\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604605 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-scripts\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604629 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-log-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604647 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-config-data\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-run-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604696 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604736 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.604771 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706495 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706585 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706632 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706731 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6vh\" (UniqueName: \"kubernetes.io/projected/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-kube-api-access-lk6vh\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-scripts\") pod \"ceilometer-0\" (UID: 
\"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706810 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-log-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706830 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-config-data\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.706847 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-run-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.707305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-run-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.708400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-log-httpd\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.712651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-scripts\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.713393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.717042 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.718025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.724783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-config-data\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc 
kubenswrapper[4772]: I0930 17:25:16.725397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6vh\" (UniqueName: \"kubernetes.io/projected/2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5-kube-api-access-lk6vh\") pod \"ceilometer-0\" (UID: \"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5\") " pod="openstack/ceilometer-0" Sep 30 17:25:16 crc kubenswrapper[4772]: I0930 17:25:16.919771 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 17:25:17 crc kubenswrapper[4772]: I0930 17:25:17.402447 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 17:25:17 crc kubenswrapper[4772]: W0930 17:25:17.404944 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b38e508_02ac_4e05_aca3_2eb9e5ccd4b5.slice/crio-03636af19f795829bc10d7579b7b71ac19f1c85e40fc959f50e27c1162b766c5 WatchSource:0}: Error finding container 03636af19f795829bc10d7579b7b71ac19f1c85e40fc959f50e27c1162b766c5: Status 404 returned error can't find the container with id 03636af19f795829bc10d7579b7b71ac19f1c85e40fc959f50e27c1162b766c5 Sep 30 17:25:17 crc kubenswrapper[4772]: I0930 17:25:17.759650 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:17 crc kubenswrapper[4772]: I0930 17:25:17.781426 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:17 crc kubenswrapper[4772]: I0930 17:25:17.921185 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43550b7-9e60-4018-99bb-c1ef5c05b022" path="/var/lib/kubelet/pods/f43550b7-9e60-4018-99bb-c1ef5c05b022/volumes" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.215034 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5","Type":"ContainerStarted","Data":"7628c68a0b08ac7ba03f648bd29c403f12013f16da0c1f5b862b8bf2c2899d95"} Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.215145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5","Type":"ContainerStarted","Data":"f1897483764df06d8ab03a18fd23537e994b4223cc5f8bb16de26a2cd69d1751"} Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.215159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5","Type":"ContainerStarted","Data":"03636af19f795829bc10d7579b7b71ac19f1c85e40fc959f50e27c1162b766c5"} Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.234200 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.411566 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gwmjj"] Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.413426 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.417288 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.417531 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.432526 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwmjj"]
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.485405 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.485472 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.485510 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.485756 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.558519 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.591596 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.592427 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.592574 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.592690 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj"
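
The mount sequence above (VerifyControllerAttachedVolume, then MountVolume started, then the MountVolume.SetUp succeeded entries that follow) is kubelet's volume reconciler converging the actual state of the world toward the desired state from the pod spec. A schematic Go sketch of the desired-versus-actual diff that drives those operations; the types and function are illustrative assumptions, not the reconciler_common.go implementation cited in the log.

package sketch

type volumeName = string

// volumesToMount returns the volumes the pod spec wants (desired) that are
// not yet mounted (actual); each result would be logged as
// "operationExecutor.MountVolume started for volume ..." and later as
// "MountVolume.SetUp succeeded" once the operation completes.
func volumesToMount(desired, actual map[volumeName]bool) []volumeName {
	var pending []volumeName
	for name := range desired {
		if !actual[name] {
			pending = append(pending, name)
		}
	}
	return pending
}
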
17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.592690 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.597245 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.597910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.605929 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.629512 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.629762 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-957558b67-rfgbf" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="dnsmasq-dns" containerID="cri-o://3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b" gracePeriod=10 Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.665326 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") pod \"nova-cell1-cell-mapping-gwmjj\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:18 crc kubenswrapper[4772]: I0930 17:25:18.738724 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.185848 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.252556 4772 generic.go:334] "Generic (PLEG): container finished" podID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerID="3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b" exitCode=0 Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.252664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-957558b67-rfgbf" event={"ID":"9df50dc7-a9cd-4936-8f5b-c469a78679ca","Type":"ContainerDied","Data":"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b"} Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.252698 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-957558b67-rfgbf" event={"ID":"9df50dc7-a9cd-4936-8f5b-c469a78679ca","Type":"ContainerDied","Data":"b0de630b3002320f367963a0ee5f8d2e33d1afb803b649c1b9af9a823adc1aea"} Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.252721 4772 scope.go:117] "RemoveContainer" containerID="3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.253003 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-957558b67-rfgbf" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.259176 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5","Type":"ContainerStarted","Data":"4714b36727295302e8b1bd9ab17fdf603d22b3ec37ea6c0c7e517cc2115dedaf"} Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.283294 4772 scope.go:117] "RemoveContainer" containerID="5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.309512 4772 scope.go:117] "RemoveContainer" containerID="3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b" Sep 30 17:25:19 crc kubenswrapper[4772]: E0930 17:25:19.310166 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b\": container with ID starting with 3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b not found: ID does not exist" containerID="3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.310229 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b"} err="failed to get container status \"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b\": rpc error: code = NotFound desc = could not find container \"3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b\": container with ID starting with 3ffa6c4f6f83c9d5a29cddd328a8ca32bfa4f706df2b4e43477e14b1c8a6223b not found: ID does not exist" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.310278 4772 scope.go:117] "RemoveContainer" containerID="5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd" Sep 30 17:25:19 crc kubenswrapper[4772]: E0930 17:25:19.310543 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd\": container with ID starting with 
5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd not found: ID does not exist" containerID="5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.310572 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd"} err="failed to get container status \"5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd\": rpc error: code = NotFound desc = could not find container \"5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd\": container with ID starting with 5fc95506516483d143bb4e65fe7645719d2d43dc310656be057a8e4c2d0cf9fd not found: ID does not exist" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.317389 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config\") pod \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.317606 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsdnl\" (UniqueName: \"kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl\") pod \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.317686 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb\") pod \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.317751 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc\") pod \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.317829 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb\") pod \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\" (UID: \"9df50dc7-a9cd-4936-8f5b-c469a78679ca\") " Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.327909 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl" (OuterVolumeSpecName: "kube-api-access-hsdnl") pod "9df50dc7-a9cd-4936-8f5b-c469a78679ca" (UID: "9df50dc7-a9cd-4936-8f5b-c469a78679ca"). InnerVolumeSpecName "kube-api-access-hsdnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.387566 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwmjj"] Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.397830 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9df50dc7-a9cd-4936-8f5b-c469a78679ca" (UID: "9df50dc7-a9cd-4936-8f5b-c469a78679ca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.399901 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9df50dc7-a9cd-4936-8f5b-c469a78679ca" (UID: "9df50dc7-a9cd-4936-8f5b-c469a78679ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.404081 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config" (OuterVolumeSpecName: "config") pod "9df50dc7-a9cd-4936-8f5b-c469a78679ca" (UID: "9df50dc7-a9cd-4936-8f5b-c469a78679ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.420187 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.422105 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsdnl\" (UniqueName: \"kubernetes.io/projected/9df50dc7-a9cd-4936-8f5b-c469a78679ca-kube-api-access-hsdnl\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.422203 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.422290 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.438963 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9df50dc7-a9cd-4936-8f5b-c469a78679ca" (UID: "9df50dc7-a9cd-4936-8f5b-c469a78679ca"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.524104 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9df50dc7-a9cd-4936-8f5b-c469a78679ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.638616 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.648564 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-957558b67-rfgbf"] Sep 30 17:25:19 crc kubenswrapper[4772]: I0930 17:25:19.911867 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" path="/var/lib/kubelet/pods/9df50dc7-a9cd-4936-8f5b-c469a78679ca/volumes" Sep 30 17:25:20 crc kubenswrapper[4772]: I0930 17:25:20.272198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwmjj" event={"ID":"4d4e6bec-dedc-4022-a6e2-d615ec89a5db","Type":"ContainerStarted","Data":"a4dc299d7e32e1abf9062711f58d5b154593c40e3440f02cc0dacc20b5b7f014"} Sep 30 17:25:20 crc kubenswrapper[4772]: I0930 17:25:20.272518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwmjj" event={"ID":"4d4e6bec-dedc-4022-a6e2-d615ec89a5db","Type":"ContainerStarted","Data":"f99bd362200c2187278a9a7e6c5c0875600330ef837d8378aeb700b429abeb00"} Sep 30 17:25:20 crc kubenswrapper[4772]: I0930 17:25:20.294275 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gwmjj" podStartSLOduration=2.294249424 podStartE2EDuration="2.294249424s" podCreationTimestamp="2025-09-30 17:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:20.286387277 +0000 UTC m=+1421.193400108" watchObservedRunningTime="2025-09-30 17:25:20.294249424 +0000 UTC m=+1421.201262255" Sep 30 17:25:21 crc kubenswrapper[4772]: I0930 17:25:21.285082 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5","Type":"ContainerStarted","Data":"c2e65ad248648729bd45bd30400b09cf2c1c1b26a4285f401437b18af2623fb0"} Sep 30 17:25:21 crc kubenswrapper[4772]: I0930 17:25:21.313047 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.383026676 podStartE2EDuration="5.313013901s" podCreationTimestamp="2025-09-30 17:25:16 +0000 UTC" firstStartedPulling="2025-09-30 17:25:17.407162625 +0000 UTC m=+1418.314175456" lastFinishedPulling="2025-09-30 17:25:20.33714985 +0000 UTC m=+1421.244162681" observedRunningTime="2025-09-30 17:25:21.304567029 +0000 UTC m=+1422.211579860" watchObservedRunningTime="2025-09-30 17:25:21.313013901 +0000 UTC m=+1422.220026732" Sep 30 17:25:22 crc kubenswrapper[4772]: I0930 17:25:22.293862 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 17:25:24 crc kubenswrapper[4772]: I0930 17:25:24.547229 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:25:24 crc kubenswrapper[4772]: I0930 17:25:24.547518 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:25:25 crc kubenswrapper[4772]: I0930 
17:25:25.341799 4772 generic.go:334] "Generic (PLEG): container finished" podID="4d4e6bec-dedc-4022-a6e2-d615ec89a5db" containerID="a4dc299d7e32e1abf9062711f58d5b154593c40e3440f02cc0dacc20b5b7f014" exitCode=0 Sep 30 17:25:25 crc kubenswrapper[4772]: I0930 17:25:25.342448 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwmjj" event={"ID":"4d4e6bec-dedc-4022-a6e2-d615ec89a5db","Type":"ContainerDied","Data":"a4dc299d7e32e1abf9062711f58d5b154593c40e3440f02cc0dacc20b5b7f014"} Sep 30 17:25:25 crc kubenswrapper[4772]: I0930 17:25:25.564196 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:25 crc kubenswrapper[4772]: I0930 17:25:25.564204 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.745580 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.870317 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle\") pod \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.870416 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data\") pod \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.870673 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") pod \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.870703 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts\") pod \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\" (UID: \"4d4e6bec-dedc-4022-a6e2-d615ec89a5db\") " Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.876952 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6" (OuterVolumeSpecName: "kube-api-access-gj9v6") pod "4d4e6bec-dedc-4022-a6e2-d615ec89a5db" (UID: "4d4e6bec-dedc-4022-a6e2-d615ec89a5db"). InnerVolumeSpecName "kube-api-access-gj9v6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.887946 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts" (OuterVolumeSpecName: "scripts") pod "4d4e6bec-dedc-4022-a6e2-d615ec89a5db" (UID: "4d4e6bec-dedc-4022-a6e2-d615ec89a5db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.901169 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d4e6bec-dedc-4022-a6e2-d615ec89a5db" (UID: "4d4e6bec-dedc-4022-a6e2-d615ec89a5db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.908683 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data" (OuterVolumeSpecName: "config-data") pod "4d4e6bec-dedc-4022-a6e2-d615ec89a5db" (UID: "4d4e6bec-dedc-4022-a6e2-d615ec89a5db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.972574 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj9v6\" (UniqueName: \"kubernetes.io/projected/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-kube-api-access-gj9v6\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.972615 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.972629 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:26 crc kubenswrapper[4772]: I0930 17:25:26.972640 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4e6bec-dedc-4022-a6e2-d615ec89a5db-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.367625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwmjj" event={"ID":"4d4e6bec-dedc-4022-a6e2-d615ec89a5db","Type":"ContainerDied","Data":"f99bd362200c2187278a9a7e6c5c0875600330ef837d8378aeb700b429abeb00"} Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.368018 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f99bd362200c2187278a9a7e6c5c0875600330ef837d8378aeb700b429abeb00" Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.367843 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwmjj" Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.536249 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.537215 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-api" containerID="cri-o://f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7" gracePeriod=30 Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.537444 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-log" containerID="cri-o://208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291" gracePeriod=30 Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.572486 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.572743 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerName="nova-scheduler-scheduler" containerID="cri-o://0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" gracePeriod=30 Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.603461 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.603712 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-log" containerID="cri-o://ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4" gracePeriod=30 Sep 30 17:25:27 crc kubenswrapper[4772]: I0930 17:25:27.603973 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-metadata" containerID="cri-o://8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d" gracePeriod=30 Sep 30 17:25:28 crc kubenswrapper[4772]: I0930 17:25:28.408249 4772 generic.go:334] "Generic (PLEG): container finished" podID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerID="ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4" exitCode=143 Sep 30 17:25:28 crc kubenswrapper[4772]: I0930 17:25:28.408383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerDied","Data":"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4"} Sep 30 17:25:28 crc kubenswrapper[4772]: I0930 17:25:28.412981 4772 generic.go:334] "Generic (PLEG): container finished" podID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerID="208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291" exitCode=143 Sep 30 17:25:28 crc kubenswrapper[4772]: I0930 17:25:28.413030 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerDied","Data":"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291"} Sep 30 17:25:28 crc kubenswrapper[4772]: I0930 17:25:28.935438 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.019760 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle\") pod \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.019857 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs\") pod \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.019981 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbl5q\" (UniqueName: \"kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q\") pod \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.020105 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs\") pod \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.020121 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data\") pod \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\" (UID: \"6f7babc7-57a9-4eed-a69d-75498c70f2d9\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.021462 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs" (OuterVolumeSpecName: "logs") pod "6f7babc7-57a9-4eed-a69d-75498c70f2d9" (UID: "6f7babc7-57a9-4eed-a69d-75498c70f2d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.045534 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q" (OuterVolumeSpecName: "kube-api-access-wbl5q") pod "6f7babc7-57a9-4eed-a69d-75498c70f2d9" (UID: "6f7babc7-57a9-4eed-a69d-75498c70f2d9"). InnerVolumeSpecName "kube-api-access-wbl5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.055195 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data" (OuterVolumeSpecName: "config-data") pod "6f7babc7-57a9-4eed-a69d-75498c70f2d9" (UID: "6f7babc7-57a9-4eed-a69d-75498c70f2d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.062821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f7babc7-57a9-4eed-a69d-75498c70f2d9" (UID: "6f7babc7-57a9-4eed-a69d-75498c70f2d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.114229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6f7babc7-57a9-4eed-a69d-75498c70f2d9" (UID: "6f7babc7-57a9-4eed-a69d-75498c70f2d9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.123588 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbl5q\" (UniqueName: \"kubernetes.io/projected/6f7babc7-57a9-4eed-a69d-75498c70f2d9-kube-api-access-wbl5q\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.123625 4772 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.123639 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.123650 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7babc7-57a9-4eed-a69d-75498c70f2d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.123661 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7babc7-57a9-4eed-a69d-75498c70f2d9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.229047 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.230888 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.233385 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.233427 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerName="nova-scheduler-scheduler" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.426775 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerID="8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d" exitCode=0 Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.426838 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerDied","Data":"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d"} Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.426925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f7babc7-57a9-4eed-a69d-75498c70f2d9","Type":"ContainerDied","Data":"3408e19e00dd6471ee533f0732d295738b8ebb8ac4ebfe8b48db8c77b18be461"} Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.426947 4772 scope.go:117] "RemoveContainer" containerID="8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.426864 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.516476 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.519582 4772 scope.go:117] "RemoveContainer" containerID="ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.528290 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536388 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.536826 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="dnsmasq-dns" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536843 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="dnsmasq-dns" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.536872 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4e6bec-dedc-4022-a6e2-d615ec89a5db" containerName="nova-manage" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4e6bec-dedc-4022-a6e2-d615ec89a5db" containerName="nova-manage" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.536895 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="init" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536902 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="init" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.536926 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-log" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536935 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-log" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.536955 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-metadata" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.536966 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-metadata" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.537190 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df50dc7-a9cd-4936-8f5b-c469a78679ca" containerName="dnsmasq-dns" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.537219 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-log" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.537234 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4e6bec-dedc-4022-a6e2-d615ec89a5db" containerName="nova-manage" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.537247 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" containerName="nova-metadata-metadata" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.538496 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.543608 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.545680 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.550961 4772 scope.go:117] "RemoveContainer" containerID="8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.552115 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d\": container with ID starting with 8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d not found: ID does not exist" containerID="8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.552167 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d"} err="failed to get container status \"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d\": rpc error: code = NotFound desc = could not find container \"8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d\": container with ID starting with 8cae5b375156d402f8c13c7ec532a440412c1672b6ca078b1738eef17866894d not found: ID does not exist" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.552203 4772 scope.go:117] "RemoveContainer" containerID="ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4" Sep 30 17:25:29 crc kubenswrapper[4772]: E0930 17:25:29.552554 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4\": container with ID starting with ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4 not found: ID does not exist" containerID="ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.552584 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4"} err="failed to get container status \"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4\": rpc error: code = NotFound desc = could not find container \"ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4\": container with ID starting with ef887ca2cb5e37182d8193bc8e0bc64209ac0aa68f59a3c3481ab200d98200c4 not found: ID does not exist" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.558325 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.633448 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.633543 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc608be5-335a-4080-9a63-9266b733dde3-logs\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.633634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-config-data\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.633701 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.633732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwrc\" (UniqueName: \"kubernetes.io/projected/dc608be5-335a-4080-9a63-9266b733dde3-kube-api-access-6pwrc\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.736970 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.737338 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwrc\" (UniqueName: \"kubernetes.io/projected/dc608be5-335a-4080-9a63-9266b733dde3-kube-api-access-6pwrc\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.737468 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.737513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc608be5-335a-4080-9a63-9266b733dde3-logs\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.737546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-config-data\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.738409 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc608be5-335a-4080-9a63-9266b733dde3-logs\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.742875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.742875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-config-data\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.743453 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc608be5-335a-4080-9a63-9266b733dde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.756127 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwrc\" (UniqueName: \"kubernetes.io/projected/dc608be5-335a-4080-9a63-9266b733dde3-kube-api-access-6pwrc\") pod \"nova-metadata-0\" (UID: \"dc608be5-335a-4080-9a63-9266b733dde3\") " pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.775660 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.839588 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.839701 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwq4d\" (UniqueName: \"kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.839742 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.839810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.839878 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.840010 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data\") pod \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\" (UID: \"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c\") " Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.840767 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs" (OuterVolumeSpecName: "logs") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.845541 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d" (OuterVolumeSpecName: "kube-api-access-vwq4d") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "kube-api-access-vwq4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.858607 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.868870 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data" (OuterVolumeSpecName: "config-data") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.875087 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.897348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.912803 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7babc7-57a9-4eed-a69d-75498c70f2d9" path="/var/lib/kubelet/pods/6f7babc7-57a9-4eed-a69d-75498c70f2d9/volumes" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.929690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" (UID: "97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943780 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943808 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwq4d\" (UniqueName: \"kubernetes.io/projected/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-kube-api-access-vwq4d\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943818 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943827 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943836 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:29 crc kubenswrapper[4772]: I0930 17:25:29.943846 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:30 crc kubenswrapper[4772]: W0930 17:25:30.286733 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc608be5_335a_4080_9a63_9266b733dde3.slice/crio-383c9c97d4a76de8708345391ddc781c92152ce41c73b352de6a8516592767fa WatchSource:0}: Error finding container 383c9c97d4a76de8708345391ddc781c92152ce41c73b352de6a8516592767fa: Status 404 returned error can't find the container with id 383c9c97d4a76de8708345391ddc781c92152ce41c73b352de6a8516592767fa Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.287991 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.437661 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.437654 4772 generic.go:334] "Generic (PLEG): container finished" podID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerID="f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7" exitCode=0 Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.437710 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerDied","Data":"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7"} Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.437758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c","Type":"ContainerDied","Data":"39fd31de2a95417ead5f44888f065f942086093415a64484d23553fab6190bdb"} Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.437775 4772 scope.go:117] "RemoveContainer" containerID="f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.440245 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc608be5-335a-4080-9a63-9266b733dde3","Type":"ContainerStarted","Data":"383c9c97d4a76de8708345391ddc781c92152ce41c73b352de6a8516592767fa"} Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.470541 4772 scope.go:117] "RemoveContainer" containerID="208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.472575 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.488384 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.503938 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:30 crc kubenswrapper[4772]: E0930 17:25:30.504412 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-api" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.504430 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-api" Sep 30 17:25:30 crc kubenswrapper[4772]: E0930 17:25:30.504464 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-log" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.504471 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-log" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.504634 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-log" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.504655 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" containerName="nova-api-api" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.509702 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.516584 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.516783 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.520148 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.528926 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.529357 4772 scope.go:117] "RemoveContainer" containerID="f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7" Sep 30 17:25:30 crc kubenswrapper[4772]: E0930 17:25:30.538648 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7\": container with ID starting with f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7 not found: ID does not exist" containerID="f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.538690 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7"} err="failed to get container status \"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7\": rpc error: code = NotFound desc = could not find container \"f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7\": container with ID starting with f10e5957a23af49256fd86a40bc75c1da382bdc8ea1bf395dd738a7b1ab31dc7 not found: ID does not exist" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.538715 4772 scope.go:117] "RemoveContainer" containerID="208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291" Sep 30 17:25:30 crc kubenswrapper[4772]: E0930 17:25:30.541217 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291\": container with ID starting with 208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291 not found: ID does not exist" containerID="208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.541240 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291"} err="failed to get container status \"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291\": rpc error: code = NotFound desc = could not find container \"208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291\": container with ID starting with 208d08bbb28fb97328f63699249bf4b068ff1eaa19d501f240b049bf7dc88291 not found: ID does not exist" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.588085 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-config-data\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc 
kubenswrapper[4772]: I0930 17:25:30.588136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.588186 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19205b6f-4fbc-4114-809f-3f105f8469bb-logs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.588226 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzwj\" (UniqueName: \"kubernetes.io/projected/19205b6f-4fbc-4114-809f-3f105f8469bb-kube-api-access-wpzwj\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.588507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.588715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.690899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.692031 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.692112 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-config-data\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.692144 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.692211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19205b6f-4fbc-4114-809f-3f105f8469bb-logs\") pod \"nova-api-0\" (UID: 
\"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.692273 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpzwj\" (UniqueName: \"kubernetes.io/projected/19205b6f-4fbc-4114-809f-3f105f8469bb-kube-api-access-wpzwj\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.693017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19205b6f-4fbc-4114-809f-3f105f8469bb-logs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.695947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-config-data\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.695993 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.696694 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.697223 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19205b6f-4fbc-4114-809f-3f105f8469bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.709378 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpzwj\" (UniqueName: \"kubernetes.io/projected/19205b6f-4fbc-4114-809f-3f105f8469bb-kube-api-access-wpzwj\") pod \"nova-api-0\" (UID: \"19205b6f-4fbc-4114-809f-3f105f8469bb\") " pod="openstack/nova-api-0" Sep 30 17:25:30 crc kubenswrapper[4772]: I0930 17:25:30.839128 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.320539 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.453699 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc608be5-335a-4080-9a63-9266b733dde3","Type":"ContainerStarted","Data":"80dad4e3e1d4948f8339990883c3ffbb42d19e9a49eacafd75151b8293d1b877"} Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.453831 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc608be5-335a-4080-9a63-9266b733dde3","Type":"ContainerStarted","Data":"db3dfd276be03e206b96f4826c6f5234dbd80e1cff656f4e2f1e22d3c63c5996"} Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.459280 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"19205b6f-4fbc-4114-809f-3f105f8469bb","Type":"ContainerStarted","Data":"bacd95d259ac61c0ce76fac38095367f3a8fb2d84b67bd556ae1eaed9d253978"} Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.481115 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.481097537 podStartE2EDuration="2.481097537s" podCreationTimestamp="2025-09-30 17:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:31.477301066 +0000 UTC m=+1432.384313897" watchObservedRunningTime="2025-09-30 17:25:31.481097537 +0000 UTC m=+1432.388110368" Sep 30 17:25:31 crc kubenswrapper[4772]: I0930 17:25:31.909477 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c" path="/var/lib/kubelet/pods/97fbd0fa-9eb5-4ec2-ae83-1b6a2e26220c/volumes" Sep 30 17:25:32 crc kubenswrapper[4772]: I0930 17:25:32.470598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"19205b6f-4fbc-4114-809f-3f105f8469bb","Type":"ContainerStarted","Data":"22d66b66d0ffa055bd6d7a8a910622b309574c5d4e51c40ec432efe43c6d83cd"} Sep 30 17:25:32 crc kubenswrapper[4772]: I0930 17:25:32.470934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"19205b6f-4fbc-4114-809f-3f105f8469bb","Type":"ContainerStarted","Data":"20141af2e50639f03cbc0a850e5fd557c4c6b26f9594c15bac5763595705c884"} Sep 30 17:25:32 crc kubenswrapper[4772]: I0930 17:25:32.495435 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.495408425 podStartE2EDuration="2.495408425s" podCreationTimestamp="2025-09-30 17:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:32.492371834 +0000 UTC m=+1433.399384675" watchObservedRunningTime="2025-09-30 17:25:32.495408425 +0000 UTC m=+1433.402421256" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.486297 4772 generic.go:334] "Generic (PLEG): container finished" podID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerID="0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" exitCode=0 Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.486399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"221815bf-6b6e-4241-8dc2-6591acff3e68","Type":"ContainerDied","Data":"0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc"} Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.641288 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.774484 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle\") pod \"221815bf-6b6e-4241-8dc2-6591acff3e68\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.774825 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data\") pod \"221815bf-6b6e-4241-8dc2-6591acff3e68\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.775840 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf94h\" (UniqueName: \"kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h\") pod \"221815bf-6b6e-4241-8dc2-6591acff3e68\" (UID: \"221815bf-6b6e-4241-8dc2-6591acff3e68\") " Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.780967 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h" (OuterVolumeSpecName: "kube-api-access-rf94h") pod "221815bf-6b6e-4241-8dc2-6591acff3e68" (UID: "221815bf-6b6e-4241-8dc2-6591acff3e68"). InnerVolumeSpecName "kube-api-access-rf94h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.806122 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data" (OuterVolumeSpecName: "config-data") pod "221815bf-6b6e-4241-8dc2-6591acff3e68" (UID: "221815bf-6b6e-4241-8dc2-6591acff3e68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.808920 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "221815bf-6b6e-4241-8dc2-6591acff3e68" (UID: "221815bf-6b6e-4241-8dc2-6591acff3e68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.879834 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.879871 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221815bf-6b6e-4241-8dc2-6591acff3e68-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:33 crc kubenswrapper[4772]: I0930 17:25:33.879880 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf94h\" (UniqueName: \"kubernetes.io/projected/221815bf-6b6e-4241-8dc2-6591acff3e68-kube-api-access-rf94h\") on node \"crc\" DevicePath \"\"" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.498142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"221815bf-6b6e-4241-8dc2-6591acff3e68","Type":"ContainerDied","Data":"69318a05ebf5d504e25f212c9c750c38ba024c0c8a532f4731eb45c5e6ca6712"} Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.498208 4772 scope.go:117] "RemoveContainer" containerID="0a751948491b112a21ff285cb9e546fcdbfc7eee8abf0d51c5f63569c85e98bc" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.498215 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.527766 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.537875 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.548578 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:34 crc kubenswrapper[4772]: E0930 17:25:34.549358 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerName="nova-scheduler-scheduler" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.549378 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerName="nova-scheduler-scheduler" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.549581 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" containerName="nova-scheduler-scheduler" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.550363 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.555782 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.557199 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.695360 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22tq\" (UniqueName: \"kubernetes.io/projected/072e2a3c-9da9-4b3d-ab28-05338d20eb88-kube-api-access-b22tq\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.695481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.695603 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-config-data\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.797957 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-config-data\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.799185 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22tq\" (UniqueName: \"kubernetes.io/projected/072e2a3c-9da9-4b3d-ab28-05338d20eb88-kube-api-access-b22tq\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.799223 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.802700 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-config-data\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.803660 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072e2a3c-9da9-4b3d-ab28-05338d20eb88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.815940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22tq\" (UniqueName: 
\"kubernetes.io/projected/072e2a3c-9da9-4b3d-ab28-05338d20eb88-kube-api-access-b22tq\") pod \"nova-scheduler-0\" (UID: \"072e2a3c-9da9-4b3d-ab28-05338d20eb88\") " pod="openstack/nova-scheduler-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.859353 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.860422 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:25:34 crc kubenswrapper[4772]: I0930 17:25:34.870562 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:25:35 crc kubenswrapper[4772]: W0930 17:25:35.320802 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod072e2a3c_9da9_4b3d_ab28_05338d20eb88.slice/crio-b7e0f5918e55433b9f244d57bbe9a4ef26edbe3bc7fdef73c825a6859fd2e25f WatchSource:0}: Error finding container b7e0f5918e55433b9f244d57bbe9a4ef26edbe3bc7fdef73c825a6859fd2e25f: Status 404 returned error can't find the container with id b7e0f5918e55433b9f244d57bbe9a4ef26edbe3bc7fdef73c825a6859fd2e25f Sep 30 17:25:35 crc kubenswrapper[4772]: I0930 17:25:35.348972 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:25:35 crc kubenswrapper[4772]: I0930 17:25:35.510808 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"072e2a3c-9da9-4b3d-ab28-05338d20eb88","Type":"ContainerStarted","Data":"b7e0f5918e55433b9f244d57bbe9a4ef26edbe3bc7fdef73c825a6859fd2e25f"} Sep 30 17:25:35 crc kubenswrapper[4772]: I0930 17:25:35.908652 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221815bf-6b6e-4241-8dc2-6591acff3e68" path="/var/lib/kubelet/pods/221815bf-6b6e-4241-8dc2-6591acff3e68/volumes" Sep 30 17:25:36 crc kubenswrapper[4772]: I0930 17:25:36.521774 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"072e2a3c-9da9-4b3d-ab28-05338d20eb88","Type":"ContainerStarted","Data":"18d64dd3860268fbc090c7340865e019b7defcf876956b63def06030992be4b4"} Sep 30 17:25:36 crc kubenswrapper[4772]: I0930 17:25:36.543792 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.543773708 podStartE2EDuration="2.543773708s" podCreationTimestamp="2025-09-30 17:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:25:36.535797736 +0000 UTC m=+1437.442810567" watchObservedRunningTime="2025-09-30 17:25:36.543773708 +0000 UTC m=+1437.450786539" Sep 30 17:25:39 crc kubenswrapper[4772]: I0930 17:25:39.859787 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:25:39 crc kubenswrapper[4772]: I0930 17:25:39.860369 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:25:39 crc kubenswrapper[4772]: I0930 17:25:39.870895 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:25:40 crc kubenswrapper[4772]: I0930 17:25:40.839887 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:25:40 crc kubenswrapper[4772]: I0930 17:25:40.839953 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:25:40 crc kubenswrapper[4772]: I0930 17:25:40.874353 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dc608be5-335a-4080-9a63-9266b733dde3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:40 crc kubenswrapper[4772]: I0930 17:25:40.874754 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dc608be5-335a-4080-9a63-9266b733dde3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:41 crc kubenswrapper[4772]: I0930 17:25:41.853474 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="19205b6f-4fbc-4114-809f-3f105f8469bb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:41 crc kubenswrapper[4772]: I0930 17:25:41.853527 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="19205b6f-4fbc-4114-809f-3f105f8469bb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:25:44 crc kubenswrapper[4772]: I0930 17:25:44.871531 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 17:25:44 crc kubenswrapper[4772]: I0930 17:25:44.897045 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 17:25:45 crc kubenswrapper[4772]: I0930 17:25:45.653928 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 17:25:46 crc kubenswrapper[4772]: I0930 17:25:46.932146 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 17:25:49 crc kubenswrapper[4772]: I0930 17:25:49.865569 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 17:25:49 crc kubenswrapper[4772]: I0930 17:25:49.866663 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 17:25:49 crc kubenswrapper[4772]: I0930 17:25:49.874274 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 17:25:49 crc kubenswrapper[4772]: I0930 17:25:49.878404 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 17:25:50 crc kubenswrapper[4772]: I0930 17:25:50.847940 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:25:50 crc kubenswrapper[4772]: I0930 17:25:50.848527 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:25:50 crc kubenswrapper[4772]: I0930 17:25:50.848986 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:25:50 crc kubenswrapper[4772]: I0930 17:25:50.862453 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Sep 30 17:25:51 crc kubenswrapper[4772]: I0930 17:25:51.706937 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:25:51 crc kubenswrapper[4772]: I0930 17:25:51.721338 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:25:59 crc kubenswrapper[4772]: I0930 17:25:59.750014 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:01 crc kubenswrapper[4772]: I0930 17:26:01.157737 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:02 crc kubenswrapper[4772]: I0930 17:26:02.916458 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" containerID="cri-o://873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543" gracePeriod=604797 Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.241994 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" containerID="cri-o://ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f" gracePeriod=604797 Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.599574 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.649160 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"] Sep 30 17:26:04 crc kubenswrapper[4772]: E0930 17:26:04.649826 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.649857 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" Sep 30 17:26:04 crc kubenswrapper[4772]: E0930 17:26:04.649877 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="setup-container" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.649887 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="setup-container" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.650153 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerName="rabbitmq" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.652191 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667348 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667378 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667423 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667450 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667491 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942vw\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667609 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667643 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667781 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667813 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf\") pod 
\"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.667891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data\") pod \"c0788e86-24b4-421d-98c9-12f0a8e52740\" (UID: \"c0788e86-24b4-421d-98c9-12f0a8e52740\") " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.669811 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.669995 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.691486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.698358 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info" (OuterVolumeSpecName: "pod-info") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.698566 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.749993 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw" (OuterVolumeSpecName: "kube-api-access-942vw") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "kube-api-access-942vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.755410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.701681 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.766706 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"] Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.800384 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0788e86-24b4-421d-98c9-12f0a8e52740-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.800470 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.800566 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.803702 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.803784 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.803812 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942vw\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-kube-api-access-942vw\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.803832 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0788e86-24b4-421d-98c9-12f0a8e52740-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.803851 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.912999 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.937164 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data" (OuterVolumeSpecName: "config-data") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.947205 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.947475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxfj\" (UniqueName: \"kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.947577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.947801 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.947821 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.968104 4772 generic.go:334] "Generic (PLEG): container finished" podID="c0788e86-24b4-421d-98c9-12f0a8e52740" containerID="873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543" exitCode=0 Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.968167 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerDied","Data":"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543"} Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.968208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0788e86-24b4-421d-98c9-12f0a8e52740","Type":"ContainerDied","Data":"d5d75b166cc3d246f3ca13757503f1c51a46835d787fa2f1696806213ba9a783"} Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.968232 4772 scope.go:117] "RemoveContainer" containerID="873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.968478 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:26:04 crc kubenswrapper[4772]: I0930 17:26:04.977987 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf" (OuterVolumeSpecName: "server-conf") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.014093 4772 scope.go:117] "RemoveContainer" containerID="901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.051235 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.051342 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.051419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxfj\" (UniqueName: \"kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.051504 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0788e86-24b4-421d-98c9-12f0a8e52740-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.052880 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.053117 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.081808 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxfj\" (UniqueName: \"kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj\") pod \"certified-operators-kjcrd\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") " pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.098788 4772 scope.go:117] "RemoveContainer" containerID="873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543" Sep 30 17:26:05 crc kubenswrapper[4772]: E0930 17:26:05.099337 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543\": container with ID starting with 873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543 not found: ID does not exist" containerID="873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 
17:26:05.099372 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543"} err="failed to get container status \"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543\": rpc error: code = NotFound desc = could not find container \"873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543\": container with ID starting with 873d1c1c6f2556bb00326fea179a898f495c2ae1f86b6c0b3163896de38eb543 not found: ID does not exist" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.099396 4772 scope.go:117] "RemoveContainer" containerID="901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4" Sep 30 17:26:05 crc kubenswrapper[4772]: E0930 17:26:05.099788 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4\": container with ID starting with 901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4 not found: ID does not exist" containerID="901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.099813 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4"} err="failed to get container status \"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4\": rpc error: code = NotFound desc = could not find container \"901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4\": container with ID starting with 901342f329e2119a09866cd39a331e292e025f395fb3d892c9042774ba6568d4 not found: ID does not exist" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.140250 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c0788e86-24b4-421d-98c9-12f0a8e52740" (UID: "c0788e86-24b4-421d-98c9-12f0a8e52740"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.154128 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0788e86-24b4-421d-98c9-12f0a8e52740-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.288591 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.325816 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.346974 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.366384 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.371343 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376007 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rpsdz" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376352 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376474 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376569 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376704 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376869 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.376629 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.386222 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.566345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.566796 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.566843 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567511 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567553 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567607 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvxh\" (UniqueName: 
\"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-kube-api-access-rxvxh\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567658 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567739 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.567770 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669474 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669601 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " 
pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvxh\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-kube-api-access-rxvxh\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669705 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669726 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669743 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669764 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.669784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.670205 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.671622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.674973 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.675705 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.677842 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.678742 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.680229 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.680651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.680802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.683562 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.700875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvxh\" (UniqueName: \"kubernetes.io/projected/cc65bd09-5d06-4b46-b8ca-c518e77acd9c-kube-api-access-rxvxh\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.730571 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cc65bd09-5d06-4b46-b8ca-c518e77acd9c\") " pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.838353 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.863022 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"] Sep 30 17:26:05 crc kubenswrapper[4772]: I0930 17:26:05.919322 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0788e86-24b4-421d-98c9-12f0a8e52740" path="/var/lib/kubelet/pods/c0788e86-24b4-421d-98c9-12f0a8e52740/volumes" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.983220 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.992598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerStarted","Data":"5d0377fe5b908e6dfacabd9f1edb6d4783536b9fba0286a18b483ff0a54402c7"} Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.994717 4772 generic.go:334] "Generic (PLEG): container finished" podID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerID="ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f" exitCode=0 Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.994758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerDied","Data":"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f"} Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.994775 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9e90f254-e3e7-4c4f-acfe-1a251e7682df","Type":"ContainerDied","Data":"b711f24cb10db43dd7ebc534e4b0d335b794329e1d4a97fa1f0d885d47c9a8d9"} Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.994791 4772 scope.go:117] "RemoveContainer" containerID="ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:05.994889 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.073176 4772 scope.go:117] "RemoveContainer" containerID="86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.092891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.092979 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093182 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093215 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093251 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093296 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093340 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9tk5\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093380 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093449 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.093525 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.094212 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.095764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins\") pod \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\" (UID: \"9e90f254-e3e7-4c4f-acfe-1a251e7682df\") " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.097722 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.098164 4772 scope.go:117] "RemoveContainer" containerID="ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f" Sep 30 17:26:06 crc kubenswrapper[4772]: E0930 17:26:06.098828 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f\": container with ID starting with ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f not found: ID does not exist" containerID="ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.098872 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f"} err="failed to get container status \"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f\": rpc error: code = NotFound desc = could not find container \"ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f\": container with ID starting with ffe90f1b0b6da6f564f2e0269311c0aed54d50c02a7d2582b209ae6b24b7763f not found: ID does not exist" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.098897 4772 scope.go:117] "RemoveContainer" containerID="86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.099577 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.099933 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: E0930 17:26:06.100006 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987\": container with ID starting with 86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987 not found: ID does not exist" containerID="86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.100032 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987"} err="failed to get container status \"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987\": rpc error: code = NotFound desc = could not find container \"86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987\": container with ID starting with 86fb691b68652174776b68ffb1b925e1d7168bc10a77aebe3965baf2c7584987 not found: ID does not exist" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.104271 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.106726 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5" (OuterVolumeSpecName: "kube-api-access-r9tk5") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "kube-api-access-r9tk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.111278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.111404 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.115692 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info" (OuterVolumeSpecName: "pod-info") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.160935 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data" (OuterVolumeSpecName: "config-data") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199755 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e90f254-e3e7-4c4f-acfe-1a251e7682df-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199788 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199797 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199811 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9tk5\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-kube-api-access-r9tk5\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199820 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199853 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e90f254-e3e7-4c4f-acfe-1a251e7682df-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199862 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.199888 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.223610 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf" (OuterVolumeSpecName: "server-conf") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.239105 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.262717 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9e90f254-e3e7-4c4f-acfe-1a251e7682df" (UID: "9e90f254-e3e7-4c4f-acfe-1a251e7682df"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.302381 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e90f254-e3e7-4c4f-acfe-1a251e7682df-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.302416 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e90f254-e3e7-4c4f-acfe-1a251e7682df-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.302432 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.386380 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.400250 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.412791 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:06 crc kubenswrapper[4772]: E0930 17:26:06.415046 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="setup-container" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.415097 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="setup-container" Sep 30 17:26:06 crc kubenswrapper[4772]: E0930 17:26:06.415119 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.415129 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.415591 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" containerName="rabbitmq" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.416914 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.421668 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.421865 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.422036 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.422222 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.422319 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wlgst" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.422399 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.429477 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.431032 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.471125 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.608690 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609036 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7wg\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-kube-api-access-lq7wg\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609115 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609156 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/442ae296-125c-4c92-97b3-f2c04dac157e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609194 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/442ae296-125c-4c92-97b3-f2c04dac157e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609249 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609337 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609442 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.609478 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7wg\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-kube-api-access-lq7wg\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/442ae296-125c-4c92-97b3-f2c04dac157e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711392 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/442ae296-125c-4c92-97b3-f2c04dac157e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711453 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711487 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711537 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.711555 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.712238 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.716465 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.716827 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.717666 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/442ae296-125c-4c92-97b3-f2c04dac157e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.717710 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/442ae296-125c-4c92-97b3-f2c04dac157e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.718365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.718511 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.718963 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.736130 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7wg\" (UniqueName: \"kubernetes.io/projected/442ae296-125c-4c92-97b3-f2c04dac157e-kube-api-access-lq7wg\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.749518 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.753348 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/442ae296-125c-4c92-97b3-f2c04dac157e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:06 crc kubenswrapper[4772]: I0930 17:26:06.775664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"442ae296-125c-4c92-97b3-f2c04dac157e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.012503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc65bd09-5d06-4b46-b8ca-c518e77acd9c","Type":"ContainerStarted","Data":"42e727428bc305625cc59aea4a6f53a89c2dd66493a4c3d46b476e3c1447c0fc"} Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.012642 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc65bd09-5d06-4b46-b8ca-c518e77acd9c","Type":"ContainerStarted","Data":"d13140c86cd749c94e7bd78750ac8cd57eba52bb58c13dcc0058770bdecf0af4"} Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.017077 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerID="14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071" exitCode=0 Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.017257 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerDied","Data":"14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071"} Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.043159 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:07 crc kubenswrapper[4772]: I0930 17:26:07.910049 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e90f254-e3e7-4c4f-acfe-1a251e7682df" path="/var/lib/kubelet/pods/9e90f254-e3e7-4c4f-acfe-1a251e7682df/volumes" Sep 30 17:26:08 crc kubenswrapper[4772]: I0930 17:26:08.028243 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerStarted","Data":"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"} Sep 30 17:26:08 crc kubenswrapper[4772]: I0930 17:26:08.110227 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:26:08 crc kubenswrapper[4772]: W0930 17:26:08.111464 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod442ae296_125c_4c92_97b3_f2c04dac157e.slice/crio-da24798ffb836ad4bc8c3695d35aee97de3aac6afdbbcd6fffb8a26a3bb3b4c2 WatchSource:0}: Error finding container da24798ffb836ad4bc8c3695d35aee97de3aac6afdbbcd6fffb8a26a3bb3b4c2: Status 404 returned error can't find the container with id da24798ffb836ad4bc8c3695d35aee97de3aac6afdbbcd6fffb8a26a3bb3b4c2 Sep 30 17:26:09 crc kubenswrapper[4772]: I0930 17:26:09.040693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"442ae296-125c-4c92-97b3-f2c04dac157e","Type":"ContainerStarted","Data":"8df8d231ea288b3dd6a0bb447419c51dce8f3f3e52b9827b47c91e26c6923596"} Sep 30 17:26:09 crc kubenswrapper[4772]: I0930 17:26:09.041269 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"442ae296-125c-4c92-97b3-f2c04dac157e","Type":"ContainerStarted","Data":"da24798ffb836ad4bc8c3695d35aee97de3aac6afdbbcd6fffb8a26a3bb3b4c2"} Sep 30 17:26:09 crc kubenswrapper[4772]: I0930 17:26:09.042680 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerID="7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452" exitCode=0 Sep 30 17:26:09 crc kubenswrapper[4772]: I0930 17:26:09.042712 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerDied","Data":"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"} Sep 30 17:26:10 crc kubenswrapper[4772]: I0930 17:26:10.053326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerStarted","Data":"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"} Sep 30 17:26:10 crc kubenswrapper[4772]: I0930 17:26:10.072742 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kjcrd" podStartSLOduration=3.5926644530000003 podStartE2EDuration="6.072722046s" podCreationTimestamp="2025-09-30 17:26:04 +0000 UTC" firstStartedPulling="2025-09-30 17:26:07.019669995 +0000 UTC m=+1467.926682826" lastFinishedPulling="2025-09-30 17:26:09.499727588 +0000 UTC m=+1470.406740419" observedRunningTime="2025-09-30 17:26:10.069392747 +0000 UTC m=+1470.976405608" watchObservedRunningTime="2025-09-30 17:26:10.072722046 +0000 UTC m=+1470.979734877" Sep 30 17:26:15 crc kubenswrapper[4772]: I0930 17:26:15.291399 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:15 crc kubenswrapper[4772]: I0930 17:26:15.291839 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:15 crc kubenswrapper[4772]: I0930 17:26:15.364125 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.167175 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kjcrd" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.225192 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"] Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.227376 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.240134 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"] Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.273807 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.292723 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"] Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320769 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320803 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320856 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320874 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.320939 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4f49\" (UniqueName: \"kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423169 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423215 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4f49\" (UniqueName: \"kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.423374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.424508 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.424607 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.425410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.425468 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.425523 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.444578 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4f49\" (UniqueName: \"kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49\") pod \"dnsmasq-dns-58cb99b967-z66c2\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") " pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:16 crc kubenswrapper[4772]: I0930 17:26:16.588168 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" Sep 30 17:26:17 crc kubenswrapper[4772]: I0930 17:26:17.076427 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"] Sep 30 17:26:17 crc kubenswrapper[4772]: I0930 17:26:17.125006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" event={"ID":"db26f2b0-60ae-41c8-a37d-f644a986c541","Type":"ContainerStarted","Data":"e0ab9ed0290f6b9b2c7611c0032fabe2fc58790b7bd05c98f81b2dd658891ed4"} Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.135896 4772 generic.go:334] "Generic (PLEG): container finished" podID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerID="61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6" exitCode=0 Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.135958 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" event={"ID":"db26f2b0-60ae-41c8-a37d-f644a986c541","Type":"ContainerDied","Data":"61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6"} Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.136673 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kjcrd" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="registry-server" containerID="cri-o://f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6" gracePeriod=2 Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.628483 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gdgmg"] Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.630919 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgmg" Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.646439 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdgmg"] Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.746345 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjcrd"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.776458 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ckx6\" (UniqueName: \"kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.776586 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.776665 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.878170 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwxfj\" (UniqueName: \"kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj\") pod \"5a6064c6-a8b2-4901-ba91-344249cfe582\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") "
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.878648 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content\") pod \"5a6064c6-a8b2-4901-ba91-344249cfe582\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") "
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.878730 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities\") pod \"5a6064c6-a8b2-4901-ba91-344249cfe582\" (UID: \"5a6064c6-a8b2-4901-ba91-344249cfe582\") "
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879034 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ckx6\" (UniqueName: \"kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879222 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879677 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879693 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities" (OuterVolumeSpecName: "utilities") pod "5a6064c6-a8b2-4901-ba91-344249cfe582" (UID: "5a6064c6-a8b2-4901-ba91-344249cfe582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.879760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.884480 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj" (OuterVolumeSpecName: "kube-api-access-fwxfj") pod "5a6064c6-a8b2-4901-ba91-344249cfe582" (UID: "5a6064c6-a8b2-4901-ba91-344249cfe582"). InnerVolumeSpecName "kube-api-access-fwxfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.907787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ckx6\" (UniqueName: \"kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6\") pod \"community-operators-gdgmg\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") " pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.939088 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a6064c6-a8b2-4901-ba91-344249cfe582" (UID: "5a6064c6-a8b2-4901-ba91-344249cfe582"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.981289 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwxfj\" (UniqueName: \"kubernetes.io/projected/5a6064c6-a8b2-4901-ba91-344249cfe582-kube-api-access-fwxfj\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.981324 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:18 crc kubenswrapper[4772]: I0930 17:26:18.981337 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6064c6-a8b2-4901-ba91-344249cfe582-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.042659 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.157252 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerID="f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6" exitCode=0
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.157331 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerDied","Data":"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"}
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.157360 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjcrd" event={"ID":"5a6064c6-a8b2-4901-ba91-344249cfe582","Type":"ContainerDied","Data":"5d0377fe5b908e6dfacabd9f1edb6d4783536b9fba0286a18b483ff0a54402c7"}
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.157367 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjcrd"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.157379 4772 scope.go:117] "RemoveContainer" containerID="f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.162270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" event={"ID":"db26f2b0-60ae-41c8-a37d-f644a986c541","Type":"ContainerStarted","Data":"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"}
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.162495 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58cb99b967-z66c2"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.189303 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" podStartSLOduration=3.189286159 podStartE2EDuration="3.189286159s" podCreationTimestamp="2025-09-30 17:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:26:19.186654219 +0000 UTC m=+1480.093667050" watchObservedRunningTime="2025-09-30 17:26:19.189286159 +0000 UTC m=+1480.096298980"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.197369 4772 scope.go:117] "RemoveContainer" containerID="7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.218819 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"]
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.225876 4772 scope.go:117] "RemoveContainer" containerID="14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.238090 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kjcrd"]
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.275433 4772 scope.go:117] "RemoveContainer" containerID="f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"
Sep 30 17:26:19 crc kubenswrapper[4772]: E0930 17:26:19.277692 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6\": container with ID starting with f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6 not found: ID does not exist" containerID="f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.277745 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6"} err="failed to get container status \"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6\": rpc error: code = NotFound desc = could not find container \"f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6\": container with ID starting with f3472f24e6378fca0d1de3bf3dd7ff2e76ebdfa3adad3d71338fb230236e81e6 not found: ID does not exist"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.277784 4772 scope.go:117] "RemoveContainer" containerID="7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"
Sep 30 17:26:19 crc kubenswrapper[4772]: E0930 17:26:19.278192 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452\": container with ID starting with 7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452 not found: ID does not exist" containerID="7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.278241 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452"} err="failed to get container status \"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452\": rpc error: code = NotFound desc = could not find container \"7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452\": container with ID starting with 7d0746ac3cf4426de6498e5d5528ef04af9651456a5a3f4227004712003f5452 not found: ID does not exist"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.278282 4772 scope.go:117] "RemoveContainer" containerID="14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071"
Sep 30 17:26:19 crc kubenswrapper[4772]: E0930 17:26:19.289920 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071\": container with ID starting with 14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071 not found: ID does not exist" containerID="14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.289959 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071"} err="failed to get container status \"14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071\": rpc error: code = NotFound desc = could not find container \"14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071\": container with ID starting with 14c1f5c3f3a7e0e3ef5425066c7f1b8377431acd91c834954f4e308651ad5071 not found: ID does not exist"
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.616463 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdgmg"]
Sep 30 17:26:19 crc kubenswrapper[4772]: I0930 17:26:19.912169 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" path="/var/lib/kubelet/pods/5a6064c6-a8b2-4901-ba91-344249cfe582/volumes"
Sep 30 17:26:20 crc kubenswrapper[4772]: I0930 17:26:20.175732 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerID="42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2" exitCode=0
Sep 30 17:26:20 crc kubenswrapper[4772]: I0930 17:26:20.175892 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerDied","Data":"42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2"}
Sep 30 17:26:20 crc kubenswrapper[4772]: I0930 17:26:20.175996 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerStarted","Data":"620903306b73950cc38370b653ce85d051e2fb06e18d99825637fd5cf4f8b223"}
Sep 30 17:26:22 crc kubenswrapper[4772]: I0930 17:26:22.197934 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerID="7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187" exitCode=0
Sep 30 17:26:22 crc kubenswrapper[4772]: I0930 17:26:22.198005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerDied","Data":"7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187"}
Sep 30 17:26:24 crc kubenswrapper[4772]: I0930 17:26:24.219504 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerStarted","Data":"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"}
Sep 30 17:26:24 crc kubenswrapper[4772]: I0930 17:26:24.242878 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gdgmg" podStartSLOduration=2.698601761 podStartE2EDuration="6.242859128s" podCreationTimestamp="2025-09-30 17:26:18 +0000 UTC" firstStartedPulling="2025-09-30 17:26:20.177316477 +0000 UTC m=+1481.084329308" lastFinishedPulling="2025-09-30 17:26:23.721573834 +0000 UTC m=+1484.628586675" observedRunningTime="2025-09-30 17:26:24.238898973 +0000 UTC m=+1485.145911814" watchObservedRunningTime="2025-09-30 17:26:24.242859128 +0000 UTC m=+1485.149871959"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.589347 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58cb99b967-z66c2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.646710 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"]
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.646979 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="dnsmasq-dns" containerID="cri-o://0ad2b440d72c2f98720dc865977acb06a6446d68be469b949818dfc50c4cb4eb" gracePeriod=10
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.821895 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5498b49c99-7mbh2"]
Sep 30 17:26:26 crc kubenswrapper[4772]: E0930 17:26:26.822632 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="extract-utilities"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.822650 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="extract-utilities"
Sep 30 17:26:26 crc kubenswrapper[4772]: E0930 17:26:26.822667 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="registry-server"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.822673 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="registry-server"
Sep 30 17:26:26 crc kubenswrapper[4772]: E0930 17:26:26.822691 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="extract-content"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.822697 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="extract-content"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.822868 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6064c6-a8b2-4901-ba91-344249cfe582" containerName="registry-server"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.823967 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.837969 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5498b49c99-7mbh2"]
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949356 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949404 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-openstack-edpm-ipam\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-config\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949605 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mblqd\" (UniqueName: \"kubernetes.io/projected/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-kube-api-access-mblqd\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949829 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:26 crc kubenswrapper[4772]: I0930 17:26:26.949945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-dns-svc\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053493 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-openstack-edpm-ipam\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053534 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-config\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053585 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mblqd\" (UniqueName: \"kubernetes.io/projected/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-kube-api-access-mblqd\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053633 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.053672 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-dns-svc\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.054513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.055005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-config\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.055030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-openstack-edpm-ipam\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.055307 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-dns-svc\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.055365 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.073759 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mblqd\" (UniqueName: \"kubernetes.io/projected/51a7ec88-f2d8-434d-88ea-3e3ce6c639c5-kube-api-access-mblqd\") pod \"dnsmasq-dns-5498b49c99-7mbh2\" (UID: \"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5\") " pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.184361 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.254374 4772 generic.go:334] "Generic (PLEG): container finished" podID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerID="0ad2b440d72c2f98720dc865977acb06a6446d68be469b949818dfc50c4cb4eb" exitCode=0
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.254423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" event={"ID":"b5d55792-e918-48b2-ab18-344dbd67c4b7","Type":"ContainerDied","Data":"0ad2b440d72c2f98720dc865977acb06a6446d68be469b949818dfc50c4cb4eb"}
Sep 30 17:26:27 crc kubenswrapper[4772]: W0930 17:26:27.728202 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a7ec88_f2d8_434d_88ea_3e3ce6c639c5.slice/crio-ee6625b5584c7138d3140c99ede944098456aea819ee2d69018e274dc1f9609c WatchSource:0}: Error finding container ee6625b5584c7138d3140c99ede944098456aea819ee2d69018e274dc1f9609c: Status 404 returned error can't find the container with id ee6625b5584c7138d3140c99ede944098456aea819ee2d69018e274dc1f9609c
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.729043 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5498b49c99-7mbh2"]
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.896393 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.980379 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb\") pod \"b5d55792-e918-48b2-ab18-344dbd67c4b7\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") "
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.980452 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc\") pod \"b5d55792-e918-48b2-ab18-344dbd67c4b7\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") "
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.980486 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwdpn\" (UniqueName: \"kubernetes.io/projected/b5d55792-e918-48b2-ab18-344dbd67c4b7-kube-api-access-vwdpn\") pod \"b5d55792-e918-48b2-ab18-344dbd67c4b7\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") "
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.982583 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb\") pod \"b5d55792-e918-48b2-ab18-344dbd67c4b7\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") "
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.982679 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config\") pod \"b5d55792-e918-48b2-ab18-344dbd67c4b7\" (UID: \"b5d55792-e918-48b2-ab18-344dbd67c4b7\") "
Sep 30 17:26:27 crc kubenswrapper[4772]: I0930 17:26:27.990230 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d55792-e918-48b2-ab18-344dbd67c4b7-kube-api-access-vwdpn" (OuterVolumeSpecName: "kube-api-access-vwdpn") pod "b5d55792-e918-48b2-ab18-344dbd67c4b7" (UID: "b5d55792-e918-48b2-ab18-344dbd67c4b7"). InnerVolumeSpecName "kube-api-access-vwdpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.035247 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5d55792-e918-48b2-ab18-344dbd67c4b7" (UID: "b5d55792-e918-48b2-ab18-344dbd67c4b7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.039297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5d55792-e918-48b2-ab18-344dbd67c4b7" (UID: "b5d55792-e918-48b2-ab18-344dbd67c4b7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.042823 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config" (OuterVolumeSpecName: "config") pod "b5d55792-e918-48b2-ab18-344dbd67c4b7" (UID: "b5d55792-e918-48b2-ab18-344dbd67c4b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.043482 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5d55792-e918-48b2-ab18-344dbd67c4b7" (UID: "b5d55792-e918-48b2-ab18-344dbd67c4b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.086272 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.086320 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.086336 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwdpn\" (UniqueName: \"kubernetes.io/projected/b5d55792-e918-48b2-ab18-344dbd67c4b7-kube-api-access-vwdpn\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.086351 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.086364 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d55792-e918-48b2-ab18-344dbd67c4b7-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.266115 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h"
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.266113 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd69b5dbc-nw55h" event={"ID":"b5d55792-e918-48b2-ab18-344dbd67c4b7","Type":"ContainerDied","Data":"1e372803af5f129e441cc89756d92a9add73f57d97c83ee72c9e14beee7d3c91"}
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.266742 4772 scope.go:117] "RemoveContainer" containerID="0ad2b440d72c2f98720dc865977acb06a6446d68be469b949818dfc50c4cb4eb"
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.268483 4772 generic.go:334] "Generic (PLEG): container finished" podID="51a7ec88-f2d8-434d-88ea-3e3ce6c639c5" containerID="671b7aa508772670ea7ab5332af8425b1becb67c220f27088cbed4ebe0d7fc56" exitCode=0
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.268520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2" event={"ID":"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5","Type":"ContainerDied","Data":"671b7aa508772670ea7ab5332af8425b1becb67c220f27088cbed4ebe0d7fc56"}
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.268572 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2" event={"ID":"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5","Type":"ContainerStarted","Data":"ee6625b5584c7138d3140c99ede944098456aea819ee2d69018e274dc1f9609c"}
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.318176 4772 scope.go:117] "RemoveContainer" containerID="9260a303bd7cbea98ec5d7d41b6047de570e4fc2efe778b0b02b3ea270899c01"
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.350688 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"]
Sep 30 17:26:28 crc kubenswrapper[4772]: I0930 17:26:28.363421 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd69b5dbc-nw55h"]
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.045069 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.046281 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.094031 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.279625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2" event={"ID":"51a7ec88-f2d8-434d-88ea-3e3ce6c639c5","Type":"ContainerStarted","Data":"57aab416aa2cf4c61eb2d7f1d21f35612d3feab86fea651ff284af65780f68eb"}
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.310416 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2" podStartSLOduration=3.310393788 podStartE2EDuration="3.310393788s" podCreationTimestamp="2025-09-30 17:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:26:29.301587605 +0000 UTC m=+1490.208600436" watchObservedRunningTime="2025-09-30 17:26:29.310393788 +0000 UTC m=+1490.217406619"
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.331017 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.377868 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdgmg"]
Sep 30 17:26:29 crc kubenswrapper[4772]: I0930 17:26:29.919078 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" path="/var/lib/kubelet/pods/b5d55792-e918-48b2-ab18-344dbd67c4b7/volumes"
Sep 30 17:26:30 crc kubenswrapper[4772]: I0930 17:26:30.289014 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.319824 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gdgmg" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="registry-server" containerID="cri-o://59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1" gracePeriod=2
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.827408 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.968957 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ckx6\" (UniqueName: \"kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6\") pod \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") "
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.969067 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content\") pod \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") "
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.969344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities\") pod \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\" (UID: \"ff5b8065-3efa-41fe-aec7-bb11c63f4dde\") "
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.970656 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities" (OuterVolumeSpecName: "utilities") pod "ff5b8065-3efa-41fe-aec7-bb11c63f4dde" (UID: "ff5b8065-3efa-41fe-aec7-bb11c63f4dde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:26:31 crc kubenswrapper[4772]: I0930 17:26:31.976660 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6" (OuterVolumeSpecName: "kube-api-access-7ckx6") pod "ff5b8065-3efa-41fe-aec7-bb11c63f4dde" (UID: "ff5b8065-3efa-41fe-aec7-bb11c63f4dde"). InnerVolumeSpecName "kube-api-access-7ckx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.071526 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ckx6\" (UniqueName: \"kubernetes.io/projected/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-kube-api-access-7ckx6\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.071797 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.330741 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerID="59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1" exitCode=0
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.330946 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerDied","Data":"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"}
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.331894 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgmg" event={"ID":"ff5b8065-3efa-41fe-aec7-bb11c63f4dde","Type":"ContainerDied","Data":"620903306b73950cc38370b653ce85d051e2fb06e18d99825637fd5cf4f8b223"}
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.331965 4772 scope.go:117] "RemoveContainer" containerID="59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.331038 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgmg"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.536601 4772 scope.go:117] "RemoveContainer" containerID="7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.564400 4772 scope.go:117] "RemoveContainer" containerID="42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.604762 4772 scope.go:117] "RemoveContainer" containerID="59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"
Sep 30 17:26:32 crc kubenswrapper[4772]: E0930 17:26:32.605250 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1\": container with ID starting with 59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1 not found: ID does not exist" containerID="59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.605304 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1"} err="failed to get container status \"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1\": rpc error: code = NotFound desc = could not find container \"59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1\": container with ID starting with 59cd96cd035551fbce0cb2023ec47e707b2e129dce3a3f5ecec228dce7b1c5f1 not found: ID does not exist"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.605338 4772 scope.go:117] "RemoveContainer" containerID="7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187"
Sep 30 17:26:32 crc kubenswrapper[4772]: E0930 17:26:32.606260 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187\": container with ID starting with 7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187 not found: ID does not exist" containerID="7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.606378 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187"} err="failed to get container status \"7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187\": rpc error: code = NotFound desc = could not find container \"7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187\": container with ID starting with 7d20cd88c7598aab42a166cb88c157fb30e0d2af360a5b78d1ffa02753d54187 not found: ID does not exist"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.606471 4772 scope.go:117] "RemoveContainer" containerID="42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2"
Sep 30 17:26:32 crc kubenswrapper[4772]: E0930 17:26:32.607024 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2\": container with ID starting with 42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2 not found: ID does not exist" containerID="42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.607093 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2"} err="failed to get container status \"42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2\": rpc error: code = NotFound desc = could not find container \"42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2\": container with ID starting with 42ce93596b02ae0540e2342a1745aedc224dee612d306fcf8f3c3bac4ffa25b2 not found: ID does not exist"
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.683239 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff5b8065-3efa-41fe-aec7-bb11c63f4dde" (UID: "ff5b8065-3efa-41fe-aec7-bb11c63f4dde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.684719 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5b8065-3efa-41fe-aec7-bb11c63f4dde-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.980449 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdgmg"]
Sep 30 17:26:32 crc kubenswrapper[4772]: I0930 17:26:32.997017 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gdgmg"]
Sep 30 17:26:33 crc kubenswrapper[4772]: I0930 17:26:33.910792 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" path="/var/lib/kubelet/pods/ff5b8065-3efa-41fe-aec7-bb11c63f4dde/volumes"
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.186268 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5498b49c99-7mbh2"
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.252595 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"]
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.253486 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="dnsmasq-dns" containerID="cri-o://02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796" gracePeriod=10
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.384393 4772 generic.go:334] "Generic (PLEG): container finished" podID="cc65bd09-5d06-4b46-b8ca-c518e77acd9c" containerID="42e727428bc305625cc59aea4a6f53a89c2dd66493a4c3d46b476e3c1447c0fc" exitCode=0
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.384458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc65bd09-5d06-4b46-b8ca-c518e77acd9c","Type":"ContainerDied","Data":"42e727428bc305625cc59aea4a6f53a89c2dd66493a4c3d46b476e3c1447c0fc"}
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.809740 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cb99b967-z66c2"
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898518 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898593 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898641 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898698 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898738 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.898910 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4f49\" (UniqueName: \"kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49\") pod \"db26f2b0-60ae-41c8-a37d-f644a986c541\" (UID: \"db26f2b0-60ae-41c8-a37d-f644a986c541\") "
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.905035 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49" (OuterVolumeSpecName: "kube-api-access-s4f49") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "kube-api-access-s4f49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.958308 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.958664 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config" (OuterVolumeSpecName: "config") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.962780 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.963700 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:37 crc kubenswrapper[4772]: I0930 17:26:37.967489 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db26f2b0-60ae-41c8-a37d-f644a986c541" (UID: "db26f2b0-60ae-41c8-a37d-f644a986c541"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001341 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001376 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001385 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001394 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001403 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db26f2b0-60ae-41c8-a37d-f644a986c541-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.001412 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4f49\" (UniqueName: \"kubernetes.io/projected/db26f2b0-60ae-41c8-a37d-f644a986c541-kube-api-access-s4f49\") on node \"crc\" DevicePath \"\""
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.397406 4772 generic.go:334] "Generic (PLEG): container finished" podID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerID="02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796" exitCode=0
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.397512 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cb99b967-z66c2"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.397533 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" event={"ID":"db26f2b0-60ae-41c8-a37d-f644a986c541","Type":"ContainerDied","Data":"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"}
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.398250 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cb99b967-z66c2" event={"ID":"db26f2b0-60ae-41c8-a37d-f644a986c541","Type":"ContainerDied","Data":"e0ab9ed0290f6b9b2c7611c0032fabe2fc58790b7bd05c98f81b2dd658891ed4"}
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.398288 4772 scope.go:117] "RemoveContainer" containerID="02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.401422 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc65bd09-5d06-4b46-b8ca-c518e77acd9c","Type":"ContainerStarted","Data":"86c7819ce80ba9626b9403adcf30a8d7fc2ee2f6a4e3f6a6fda70243c457720d"}
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.401772 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.427548 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.427525144 podStartE2EDuration="33.427525144s" podCreationTimestamp="2025-09-30 17:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:26:38.423923459 +0000 UTC m=+1499.330936310" watchObservedRunningTime="2025-09-30 17:26:38.427525144 +0000 UTC m=+1499.334537975"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.457693 4772 scope.go:117] "RemoveContainer" containerID="61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.478898 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"]
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.487316 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58cb99b967-z66c2"]
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.498948 4772 scope.go:117] "RemoveContainer" containerID="02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"
Sep 30 17:26:38 crc kubenswrapper[4772]: E0930 17:26:38.499422 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796\": container with ID starting with 02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796 not found: ID does not exist" containerID="02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.499462 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796"} err="failed to get container status \"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796\": rpc error: code = NotFound desc = could not find container \"02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796\": container with ID starting with 02015ae2c84705f010c1ff4e221b0b6624240fe401e2498236c9839a95696796 not found: ID does not exist"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.499489 4772 scope.go:117] "RemoveContainer" containerID="61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6"
Sep 30 17:26:38 crc kubenswrapper[4772]: E0930 17:26:38.499869 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6\": container with ID starting with 61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6 not found: ID does not exist" containerID="61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6"
Sep 30 17:26:38 crc kubenswrapper[4772]: I0930 17:26:38.499898 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6"} err="failed to get container status \"61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6\": rpc error: code = NotFound desc = could not find container \"61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6\": container with ID starting with 61810896f3a6300093e9f1f6e86348920b81ac1a7bd4a8d6c0e34d6f36e007b6 not found: ID does not exist"
Sep 30 17:26:39 crc kubenswrapper[4772]: I0930 17:26:39.435359 4772 generic.go:334] "Generic (PLEG): container finished" podID="442ae296-125c-4c92-97b3-f2c04dac157e" containerID="8df8d231ea288b3dd6a0bb447419c51dce8f3f3e52b9827b47c91e26c6923596" exitCode=0
Sep 30 17:26:39 crc kubenswrapper[4772]: I0930 17:26:39.436413 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"442ae296-125c-4c92-97b3-f2c04dac157e","Type":"ContainerDied","Data":"8df8d231ea288b3dd6a0bb447419c51dce8f3f3e52b9827b47c91e26c6923596"}
Sep 30 17:26:39 crc kubenswrapper[4772]: I0930 17:26:39.911800 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" path="/var/lib/kubelet/pods/db26f2b0-60ae-41c8-a37d-f644a986c541/volumes"
Sep 30 17:26:40 crc kubenswrapper[4772]: I0930 17:26:40.454540 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"442ae296-125c-4c92-97b3-f2c04dac157e","Type":"ContainerStarted","Data":"1d13dbd2750e595f430f77433e56debd345339261209a26b969311fa3ec68f89"}
Sep 30 17:26:40 crc kubenswrapper[4772]: I0930 17:26:40.455998 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 17:26:40 crc kubenswrapper[4772]: I0930 17:26:40.485492 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.485471289 podStartE2EDuration="34.485471289s" podCreationTimestamp="2025-09-30 17:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:26:40.477103187 +0000 UTC m=+1501.384116038" watchObservedRunningTime="2025-09-30 17:26:40.485471289 +0000 UTC m=+1501.392484110"
Sep 30 17:26:41 crc kubenswrapper[4772]: I0930 17:26:41.664198 4772 scope.go:117] "RemoveContainer" containerID="4a8b1a3f5e9f1ea0ffcdbdf0e8eef9da02c973832222958146e309f7e5ab7471"
Sep 30 17:26:41 crc kubenswrapper[4772]: I0930 17:26:41.719682 4772 scope.go:117] "RemoveContainer" containerID="120b6fbb659512b040617459d7216ef829d6dfd06f7fdeae10245601f3a3a6c9"
Sep 30 17:26:41 crc kubenswrapper[4772]: I0930 17:26:41.757765 4772 scope.go:117] "RemoveContainer" containerID="1a4133fb7ad3b9cf4ed58bf53c24a7060075d68fbba85ede3c954f922dc6e8e5"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.794662 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"]
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795695 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="init"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795715 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="init"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795736 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795744 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795760 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="registry-server"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795768 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="registry-server"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795796 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="extract-utilities"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795805 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="extract-utilities"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795816 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="extract-content"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795822 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="extract-content"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795846 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795855 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: E0930 17:26:45.795869 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="init"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.795875 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="init"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.796112 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d55792-e918-48b2-ab18-344dbd67c4b7" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.796146 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5b8065-3efa-41fe-aec7-bb11c63f4dde" containerName="registry-server"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.796154 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="db26f2b0-60ae-41c8-a37d-f644a986c541" containerName="dnsmasq-dns"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.799772 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.823552 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"]
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.878537 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.878686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.879374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wttf7\" (UniqueName: \"kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.981381 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wttf7\" (UniqueName: \"kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.981469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.981515 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.982215 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:45 crc kubenswrapper[4772]: I0930 17:26:45.982601 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:46 crc kubenswrapper[4772]: I0930 17:26:46.010412 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wttf7\" (UniqueName: \"kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7\") pod \"redhat-marketplace-wzblg\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:46 crc kubenswrapper[4772]: I0930 17:26:46.124882 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzblg"
Sep 30 17:26:46 crc kubenswrapper[4772]: I0930 17:26:46.473491 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"]
Sep 30 17:26:46 crc kubenswrapper[4772]: W0930 17:26:46.481787 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf178a32d_b123_422b_bdb7_fbdc0be3a62d.slice/crio-ad82194c03226af6c6d7cf1aebda990ebf55afd5e13ef83608dcbf4e54a15437 WatchSource:0}: Error finding container ad82194c03226af6c6d7cf1aebda990ebf55afd5e13ef83608dcbf4e54a15437: Status 404 returned error can't find the container with id ad82194c03226af6c6d7cf1aebda990ebf55afd5e13ef83608dcbf4e54a15437
Sep 30 17:26:46 crc kubenswrapper[4772]: I0930 17:26:46.522828 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerStarted","Data":"ad82194c03226af6c6d7cf1aebda990ebf55afd5e13ef83608dcbf4e54a15437"}
Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.538168 4772 generic.go:334] "Generic (PLEG): container finished" podID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerID="59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8" exitCode=0
Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.538255 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerDied","Data":"59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8"}
Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.640895 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl"]
Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.642538 4772 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.646575 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.646626 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.647007 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.647857 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl"] Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.648647 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.724408 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7b7w\" (UniqueName: \"kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.724897 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.725117 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.725254 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.828111 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.828208 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.828290 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7b7w\" (UniqueName: \"kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.828377 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.836948 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.837487 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.837718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.854258 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7b7w\" (UniqueName: \"kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:47 crc kubenswrapper[4772]: I0930 17:26:47.968806 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:26:48 crc kubenswrapper[4772]: I0930 17:26:48.584191 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl"] Sep 30 17:26:48 crc kubenswrapper[4772]: W0930 17:26:48.600431 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaddb4fb1_d812_4472_a08e_742c97c9b6d2.slice/crio-02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c WatchSource:0}: Error finding container 02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c: Status 404 returned error can't find the container with id 02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c Sep 30 17:26:49 crc kubenswrapper[4772]: I0930 17:26:49.564987 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" event={"ID":"addb4fb1-d812-4472-a08e-742c97c9b6d2","Type":"ContainerStarted","Data":"02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c"} Sep 30 17:26:49 crc kubenswrapper[4772]: I0930 17:26:49.568142 4772 generic.go:334] "Generic (PLEG): container finished" podID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerID="9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973" exitCode=0 Sep 30 17:26:49 crc kubenswrapper[4772]: I0930 17:26:49.568216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerDied","Data":"9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973"} Sep 30 17:26:50 crc kubenswrapper[4772]: I0930 17:26:50.591471 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerStarted","Data":"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0"} Sep 30 17:26:50 crc kubenswrapper[4772]: I0930 17:26:50.619406 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wzblg" podStartSLOduration=3.105389954 podStartE2EDuration="5.619384049s" podCreationTimestamp="2025-09-30 17:26:45 +0000 UTC" firstStartedPulling="2025-09-30 17:26:47.551676289 +0000 UTC m=+1508.458689120" lastFinishedPulling="2025-09-30 17:26:50.065670384 +0000 UTC m=+1510.972683215" observedRunningTime="2025-09-30 17:26:50.611389567 +0000 UTC m=+1511.518402398" watchObservedRunningTime="2025-09-30 17:26:50.619384049 +0000 UTC m=+1511.526396880" Sep 30 17:26:55 crc kubenswrapper[4772]: I0930 17:26:55.841229 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 17:26:56 crc kubenswrapper[4772]: I0930 17:26:56.125199 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:56 crc kubenswrapper[4772]: I0930 17:26:56.125651 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:56 crc kubenswrapper[4772]: I0930 17:26:56.204850 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:56 crc kubenswrapper[4772]: I0930 17:26:56.755992 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:56 crc kubenswrapper[4772]: I0930 17:26:56.840790 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"] Sep 30 17:26:57 crc kubenswrapper[4772]: I0930 17:26:57.051310 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:26:58 crc kubenswrapper[4772]: I0930 17:26:58.686917 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wzblg" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="registry-server" containerID="cri-o://f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0" gracePeriod=2 Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.162195 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.276989 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities\") pod \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.277203 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content\") pod \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.277526 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wttf7\" (UniqueName: \"kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7\") pod \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\" (UID: \"f178a32d-b123-422b-bdb7-fbdc0be3a62d\") " Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.278492 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities" (OuterVolumeSpecName: "utilities") pod "f178a32d-b123-422b-bdb7-fbdc0be3a62d" (UID: "f178a32d-b123-422b-bdb7-fbdc0be3a62d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.288731 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7" (OuterVolumeSpecName: "kube-api-access-wttf7") pod "f178a32d-b123-422b-bdb7-fbdc0be3a62d" (UID: "f178a32d-b123-422b-bdb7-fbdc0be3a62d"). InnerVolumeSpecName "kube-api-access-wttf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.290430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f178a32d-b123-422b-bdb7-fbdc0be3a62d" (UID: "f178a32d-b123-422b-bdb7-fbdc0be3a62d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.379826 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.379857 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wttf7\" (UniqueName: \"kubernetes.io/projected/f178a32d-b123-422b-bdb7-fbdc0be3a62d-kube-api-access-wttf7\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.379869 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f178a32d-b123-422b-bdb7-fbdc0be3a62d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.698489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" event={"ID":"addb4fb1-d812-4472-a08e-742c97c9b6d2","Type":"ContainerStarted","Data":"744226b27b19ad7d102f4a6fdeaf2d1472442ffde967e3f04a2c8fdc189a37cc"} Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.706862 4772 generic.go:334] "Generic (PLEG): container finished" podID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerID="f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0" exitCode=0 Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.706924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerDied","Data":"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0"} Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.706966 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzblg" event={"ID":"f178a32d-b123-422b-bdb7-fbdc0be3a62d","Type":"ContainerDied","Data":"ad82194c03226af6c6d7cf1aebda990ebf55afd5e13ef83608dcbf4e54a15437"} Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.706993 4772 scope.go:117] "RemoveContainer" containerID="f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.707205 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzblg" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.735482 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" podStartSLOduration=2.8290796289999998 podStartE2EDuration="12.735462018s" podCreationTimestamp="2025-09-30 17:26:47 +0000 UTC" firstStartedPulling="2025-09-30 17:26:48.603181373 +0000 UTC m=+1509.510194204" lastFinishedPulling="2025-09-30 17:26:58.509563762 +0000 UTC m=+1519.416576593" observedRunningTime="2025-09-30 17:26:59.72084066 +0000 UTC m=+1520.627853491" watchObservedRunningTime="2025-09-30 17:26:59.735462018 +0000 UTC m=+1520.642474849" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.742098 4772 scope.go:117] "RemoveContainer" containerID="9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.771099 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"] Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.775472 4772 scope.go:117] "RemoveContainer" containerID="59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.779372 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzblg"] Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.816929 4772 scope.go:117] "RemoveContainer" containerID="f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0" Sep 30 17:26:59 crc kubenswrapper[4772]: E0930 17:26:59.817421 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0\": container with ID starting with f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0 not found: ID does not exist" containerID="f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.817477 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0"} err="failed to get container status \"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0\": rpc error: code = NotFound desc = could not find container \"f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0\": container with ID starting with f97d0644be2ad6962e74f2f1045d10cea94e1d711ac5105b61f22f34296e17a0 not found: ID does not exist" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.817510 4772 scope.go:117] "RemoveContainer" containerID="9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973" Sep 30 17:26:59 crc kubenswrapper[4772]: E0930 17:26:59.817872 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973\": container with ID starting with 9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973 not found: ID does not exist" containerID="9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.817902 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973"} 
err="failed to get container status \"9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973\": rpc error: code = NotFound desc = could not find container \"9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973\": container with ID starting with 9f5aa632b26fb2c4d3609fa0c67837aa2fb80c3f02e714500281d31a437a2973 not found: ID does not exist" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.817918 4772 scope.go:117] "RemoveContainer" containerID="59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8" Sep 30 17:26:59 crc kubenswrapper[4772]: E0930 17:26:59.818333 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8\": container with ID starting with 59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8 not found: ID does not exist" containerID="59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.818365 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8"} err="failed to get container status \"59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8\": rpc error: code = NotFound desc = could not find container \"59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8\": container with ID starting with 59ff0829df6f598843c1dbdaaccaf88d26f5a950d860c80a27ea7bf5e3e46aa8 not found: ID does not exist" Sep 30 17:26:59 crc kubenswrapper[4772]: E0930 17:26:59.897942 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf178a32d_b123_422b_bdb7_fbdc0be3a62d.slice\": RecentStats: unable to find data in memory cache]" Sep 30 17:26:59 crc kubenswrapper[4772]: I0930 17:26:59.914989 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" path="/var/lib/kubelet/pods/f178a32d-b123-422b-bdb7-fbdc0be3a62d/volumes" Sep 30 17:27:11 crc kubenswrapper[4772]: I0930 17:27:11.841306 4772 generic.go:334] "Generic (PLEG): container finished" podID="addb4fb1-d812-4472-a08e-742c97c9b6d2" containerID="744226b27b19ad7d102f4a6fdeaf2d1472442ffde967e3f04a2c8fdc189a37cc" exitCode=0 Sep 30 17:27:11 crc kubenswrapper[4772]: I0930 17:27:11.841439 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" event={"ID":"addb4fb1-d812-4472-a08e-742c97c9b6d2","Type":"ContainerDied","Data":"744226b27b19ad7d102f4a6fdeaf2d1472442ffde967e3f04a2c8fdc189a37cc"} Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.289007 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.362923 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory\") pod \"addb4fb1-d812-4472-a08e-742c97c9b6d2\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.363027 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7b7w\" (UniqueName: \"kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w\") pod \"addb4fb1-d812-4472-a08e-742c97c9b6d2\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.363073 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle\") pod \"addb4fb1-d812-4472-a08e-742c97c9b6d2\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.363100 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key\") pod \"addb4fb1-d812-4472-a08e-742c97c9b6d2\" (UID: \"addb4fb1-d812-4472-a08e-742c97c9b6d2\") " Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.369371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w" (OuterVolumeSpecName: "kube-api-access-j7b7w") pod "addb4fb1-d812-4472-a08e-742c97c9b6d2" (UID: "addb4fb1-d812-4472-a08e-742c97c9b6d2"). InnerVolumeSpecName "kube-api-access-j7b7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.375251 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "addb4fb1-d812-4472-a08e-742c97c9b6d2" (UID: "addb4fb1-d812-4472-a08e-742c97c9b6d2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.390665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory" (OuterVolumeSpecName: "inventory") pod "addb4fb1-d812-4472-a08e-742c97c9b6d2" (UID: "addb4fb1-d812-4472-a08e-742c97c9b6d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.391718 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "addb4fb1-d812-4472-a08e-742c97c9b6d2" (UID: "addb4fb1-d812-4472-a08e-742c97c9b6d2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.466562 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.466611 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7b7w\" (UniqueName: \"kubernetes.io/projected/addb4fb1-d812-4472-a08e-742c97c9b6d2-kube-api-access-j7b7w\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.466623 4772 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.466632 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/addb4fb1-d812-4472-a08e-742c97c9b6d2-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.868018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" event={"ID":"addb4fb1-d812-4472-a08e-742c97c9b6d2","Type":"ContainerDied","Data":"02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c"} Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.868079 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d0b3cfd79caba9d47cc25ae0cb968cbab531fafcc2e0a907bc151e0b1fcd6c" Sep 30 17:27:13 crc kubenswrapper[4772]: I0930 17:27:13.868102 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.004375 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"] Sep 30 17:27:14 crc kubenswrapper[4772]: E0930 17:27:14.004784 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addb4fb1-d812-4472-a08e-742c97c9b6d2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.004802 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="addb4fb1-d812-4472-a08e-742c97c9b6d2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:27:14 crc kubenswrapper[4772]: E0930 17:27:14.004821 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="registry-server" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.004828 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="registry-server" Sep 30 17:27:14 crc kubenswrapper[4772]: E0930 17:27:14.004877 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="extract-utilities" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.004883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="extract-utilities" Sep 30 17:27:14 crc kubenswrapper[4772]: E0930 17:27:14.004898 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="extract-content" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.004904 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="extract-content" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.006097 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f178a32d-b123-422b-bdb7-fbdc0be3a62d" containerName="registry-server" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.006128 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="addb4fb1-d812-4472-a08e-742c97c9b6d2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.006876 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.012793 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.012842 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.013352 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.016386 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.031339 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"] Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.078579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.078690 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2b2\" (UniqueName: \"kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.078729 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.078806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.180954 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2b2\" (UniqueName: \"kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.181026 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.181115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.181193 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.185822 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.195190 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.198739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.201445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2b2\" (UniqueName: \"kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.333302 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.860294 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"] Sep 30 17:27:14 crc kubenswrapper[4772]: I0930 17:27:14.880937 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" event={"ID":"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0","Type":"ContainerStarted","Data":"db40911785b3b4e5417e5ab50af1674b5ea17c09335dec409771e5c58426d517"} Sep 30 17:27:15 crc kubenswrapper[4772]: I0930 17:27:15.894268 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" event={"ID":"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0","Type":"ContainerStarted","Data":"9b6432f3ad15c8e9f97708735999655110f936e734697dcf5b94d4d6cb59a4ae"} Sep 30 17:27:15 crc kubenswrapper[4772]: I0930 17:27:15.912169 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" podStartSLOduration=2.433903467 podStartE2EDuration="2.912148308s" podCreationTimestamp="2025-09-30 17:27:13 +0000 UTC" firstStartedPulling="2025-09-30 17:27:14.865715337 +0000 UTC m=+1535.772728168" lastFinishedPulling="2025-09-30 17:27:15.343960158 +0000 UTC m=+1536.250973009" observedRunningTime="2025-09-30 17:27:15.908256935 +0000 UTC m=+1536.815269766" watchObservedRunningTime="2025-09-30 17:27:15.912148308 +0000 UTC m=+1536.819161139" Sep 30 17:27:38 crc kubenswrapper[4772]: I0930 17:27:38.655116 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:27:38 crc kubenswrapper[4772]: I0930 17:27:38.655689 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:28:08 crc kubenswrapper[4772]: I0930 17:28:08.655292 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:28:08 crc kubenswrapper[4772]: I0930 17:28:08.655829 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:28:38 crc kubenswrapper[4772]: I0930 17:28:38.655844 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:28:38 crc kubenswrapper[4772]: I0930 17:28:38.656473 4772 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:28:38 crc kubenswrapper[4772]: I0930 17:28:38.656521 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:28:38 crc kubenswrapper[4772]: I0930 17:28:38.657355 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:28:38 crc kubenswrapper[4772]: I0930 17:28:38.657415 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" gracePeriod=600 Sep 30 17:28:38 crc kubenswrapper[4772]: E0930 17:28:38.788618 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:28:39 crc kubenswrapper[4772]: I0930 17:28:39.684037 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" exitCode=0 Sep 30 17:28:39 crc kubenswrapper[4772]: I0930 17:28:39.684093 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"} Sep 30 17:28:39 crc kubenswrapper[4772]: I0930 17:28:39.684472 4772 scope.go:117] "RemoveContainer" containerID="0167984dc474e8f0e251ca86d3847ef4b3ab076e2cb16fe9125a3f852650eb68" Sep 30 17:28:39 crc kubenswrapper[4772]: I0930 17:28:39.685209 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:28:39 crc kubenswrapper[4772]: E0930 17:28:39.685463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:28:50 crc kubenswrapper[4772]: I0930 17:28:50.898581 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:28:50 crc kubenswrapper[4772]: E0930 17:28:50.899703 4772 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:02 crc kubenswrapper[4772]: I0930 17:29:02.897922 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:29:02 crc kubenswrapper[4772]: E0930 17:29:02.898617 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:15 crc kubenswrapper[4772]: I0930 17:29:15.898450 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:29:15 crc kubenswrapper[4772]: E0930 17:29:15.899501 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:27 crc kubenswrapper[4772]: I0930 17:29:27.898249 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:29:27 crc kubenswrapper[4772]: E0930 17:29:27.899071 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:41 crc kubenswrapper[4772]: I0930 17:29:41.898119 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:29:41 crc kubenswrapper[4772]: E0930 17:29:41.898831 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:53 crc kubenswrapper[4772]: I0930 17:29:53.897919 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:29:53 crc kubenswrapper[4772]: E0930 17:29:53.898699 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.050481 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jw52p"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.062937 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dqw5r"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.071215 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-srdzt"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.078838 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jw52p"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.086634 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-srdzt"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.093941 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dqw5r"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.152038 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.154900 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.173024 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"] Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.182030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.182488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.182860 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxhg\" (UniqueName: \"kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.285449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.285581 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.285667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxhg\" (UniqueName: \"kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.286263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.286307 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.308907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxhg\" (UniqueName: \"kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg\") pod \"redhat-operators-khrv4\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") " pod="openshift-marketplace/redhat-operators-khrv4" Sep 30 17:29:58 crc kubenswrapper[4772]: I0930 17:29:58.500740 4772 util.go:30] "No sandbox for pod can be found. 
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.012583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"]
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.488870 4772 generic.go:334] "Generic (PLEG): container finished" podID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerID="fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8" exitCode=0
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.488952 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerDied","Data":"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8"}
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.489387 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerStarted","Data":"e730070928b4b915a1c19a3ad27ab64e5a97b927568f2ca709e28d89cdf8f7fa"}
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.491326 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.914593 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="950feb31-d399-4491-a4e4-365371d0d2b6" path="/var/lib/kubelet/pods/950feb31-d399-4491-a4e4-365371d0d2b6/volumes"
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.916037 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca281cc-c87b-4f62-8d9c-12373a1dc085" path="/var/lib/kubelet/pods/9ca281cc-c87b-4f62-8d9c-12373a1dc085/volumes"
Sep 30 17:29:59 crc kubenswrapper[4772]: I0930 17:29:59.917107 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2815398-98f4-48d9-9e2a-54f25ac3fd0c" path="/var/lib/kubelet/pods/e2815398-98f4-48d9-9e2a-54f25ac3fd0c/volumes"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.036784 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-lzvcs"]
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.046168 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-lzvcs"]
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.158539 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"]
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.161754 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
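[editor's note] The "SyncLoop (PLEG): event for pod" lines carry a pod lifecycle event serialized as JSON: the pod UID, an event type (ContainerStarted, ContainerDied, ...), and a container or sandbox ID. A stand-in struct that round-trips the payload seen above; the field names mirror the log's JSON, but this is a reconstruction for illustration, not the kubelet's internal type:

package main

import (
	"encoding/json"
	"fmt"
)

// PodLifecycleEvent models the event={...} payload in the log lines.
type PodLifecycleEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	raw := `{"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerDied","Data":"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8"}`
	var ev PodLifecycleEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s %s\n", ev.ID, ev.Type, ev.Data[:12])
}

Here the ContainerDied at 17:29:59.488952 is the extract-utilities init container exiting with code 0, and the ContainerStarted is the pod sandbox (e7300709...) coming up.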
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.165556 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.165736 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.175319 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"]
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.233603 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.233778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.234026 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh6kj\" (UniqueName: \"kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.336299 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh6kj\" (UniqueName: \"kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.336380 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.336506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.337487 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
\"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.343128 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.361652 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh6kj\" (UniqueName: \"kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj\") pod \"collect-profiles-29320890-vtqmq\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" Sep 30 17:30:00 crc kubenswrapper[4772]: I0930 17:30:00.525458 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" Sep 30 17:30:01 crc kubenswrapper[4772]: W0930 17:30:01.051707 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc870583b_ddcc_4939_94ae_f192d0ed0f2b.slice/crio-51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412 WatchSource:0}: Error finding container 51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412: Status 404 returned error can't find the container with id 51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412 Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.054109 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"] Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.526445 4772 generic.go:334] "Generic (PLEG): container finished" podID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerID="838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b" exitCode=0 Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.526618 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerDied","Data":"838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b"} Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.530807 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" event={"ID":"c870583b-ddcc-4939-94ae-f192d0ed0f2b","Type":"ContainerStarted","Data":"ae2113f964fc5355023f4ab0ac80219cbe6afa876d32608300b0a44cfaaf71b7"} Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.530898 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" event={"ID":"c870583b-ddcc-4939-94ae-f192d0ed0f2b","Type":"ContainerStarted","Data":"51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412"} Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.575615 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" podStartSLOduration=1.575588954 podStartE2EDuration="1.575588954s" podCreationTimestamp="2025-09-30 17:30:00 +0000 UTC" 
Sep 30 17:30:01 crc kubenswrapper[4772]: I0930 17:30:01.912777 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b10070-728b-4128-8faa-10b567e20342" path="/var/lib/kubelet/pods/09b10070-728b-4128-8faa-10b567e20342/volumes"
Sep 30 17:30:02 crc kubenswrapper[4772]: I0930 17:30:02.543113 4772 generic.go:334] "Generic (PLEG): container finished" podID="c870583b-ddcc-4939-94ae-f192d0ed0f2b" containerID="ae2113f964fc5355023f4ab0ac80219cbe6afa876d32608300b0a44cfaaf71b7" exitCode=0
Sep 30 17:30:02 crc kubenswrapper[4772]: I0930 17:30:02.543582 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" event={"ID":"c870583b-ddcc-4939-94ae-f192d0ed0f2b","Type":"ContainerDied","Data":"ae2113f964fc5355023f4ab0ac80219cbe6afa876d32608300b0a44cfaaf71b7"}
Sep 30 17:30:02 crc kubenswrapper[4772]: I0930 17:30:02.551616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerStarted","Data":"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"}
Sep 30 17:30:02 crc kubenswrapper[4772]: I0930 17:30:02.593133 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khrv4" podStartSLOduration=1.822395097 podStartE2EDuration="4.593106382s" podCreationTimestamp="2025-09-30 17:29:58 +0000 UTC" firstStartedPulling="2025-09-30 17:29:59.490984233 +0000 UTC m=+1700.397997064" lastFinishedPulling="2025-09-30 17:30:02.261695518 +0000 UTC m=+1703.168708349" observedRunningTime="2025-09-30 17:30:02.586156552 +0000 UTC m=+1703.493169393" watchObservedRunningTime="2025-09-30 17:30:02.593106382 +0000 UTC m=+1703.500119213"
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.005936 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"
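[editor's note] The latency-tracker lines are internally consistent. For redhat-operators-khrv4: podStartE2EDuration runs from podCreationTimestamp (17:29:58) to watchObservedRunningTime (17:30:02.593106382), i.e. 4.593106382s, and podStartSLOduration subtracts the image-pull window (17:29:59.490984233 to 17:30:02.261695518, i.e. 2.770711285s), giving 1.822395097s. For collect-profiles above, both pull timestamps are the zero time ("0001-01-01"), so no pull happened and the SLO duration equals the E2E duration. The arithmetic reproduced in Go (the time.Parse layout is an assumption about the printed format):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-09-30 17:29:58 +0000 UTC")
	firstPull, _ := time.Parse(layout, "2025-09-30 17:29:59.490984233 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2025-09-30 17:30:02.261695518 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-09-30 17:30:02.593106382 +0000 UTC")

	e2e := observed.Sub(created)    // 4.593106382s, matches podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 2.770711285s spent pulling the image
	fmt.Println(e2e, pull, e2e-pull) // e2e-pull = 1.822395097s = podStartSLOduration
}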
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.037302 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume\") pod \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") "
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.037472 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume\") pod \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") "
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.037578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh6kj\" (UniqueName: \"kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj\") pod \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\" (UID: \"c870583b-ddcc-4939-94ae-f192d0ed0f2b\") "
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.038557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume" (OuterVolumeSpecName: "config-volume") pod "c870583b-ddcc-4939-94ae-f192d0ed0f2b" (UID: "c870583b-ddcc-4939-94ae-f192d0ed0f2b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.050948 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj" (OuterVolumeSpecName: "kube-api-access-jh6kj") pod "c870583b-ddcc-4939-94ae-f192d0ed0f2b" (UID: "c870583b-ddcc-4939-94ae-f192d0ed0f2b"). InnerVolumeSpecName "kube-api-access-jh6kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.053532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c870583b-ddcc-4939-94ae-f192d0ed0f2b" (UID: "c870583b-ddcc-4939-94ae-f192d0ed0f2b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.140754 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c870583b-ddcc-4939-94ae-f192d0ed0f2b-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.140797 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c870583b-ddcc-4939-94ae-f192d0ed0f2b-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.140808 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh6kj\" (UniqueName: \"kubernetes.io/projected/c870583b-ddcc-4939-94ae-f192d0ed0f2b-kube-api-access-jh6kj\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.571374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" event={"ID":"c870583b-ddcc-4939-94ae-f192d0ed0f2b","Type":"ContainerDied","Data":"51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412"} Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.571678 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51db0837a090c58767058effdcb394b6ab47977b8eb8524479826968c2350412" Sep 30 17:30:04 crc kubenswrapper[4772]: I0930 17:30:04.571424 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq" Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.038330 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1272-account-create-42x7w"] Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.052375 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9da5-account-create-dprs7"] Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.061452 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9da5-account-create-dprs7"] Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.069350 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1272-account-create-42x7w"] Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.916282 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346d6dc6-45fc-4534-848a-181c95a3c790" path="/var/lib/kubelet/pods/346d6dc6-45fc-4534-848a-181c95a3c790/volumes" Sep 30 17:30:05 crc kubenswrapper[4772]: I0930 17:30:05.916969 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f847bcf-eda9-4d03-8b13-c7688bdeaf31" path="/var/lib/kubelet/pods/4f847bcf-eda9-4d03-8b13-c7688bdeaf31/volumes" Sep 30 17:30:07 crc kubenswrapper[4772]: I0930 17:30:07.899464 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:30:07 crc kubenswrapper[4772]: E0930 17:30:07.900217 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:30:08 crc kubenswrapper[4772]: I0930 
Sep 30 17:30:08 crc kubenswrapper[4772]: I0930 17:30:08.501322 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khrv4"
Sep 30 17:30:08 crc kubenswrapper[4772]: I0930 17:30:08.553214 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khrv4"
Sep 30 17:30:08 crc kubenswrapper[4772]: I0930 17:30:08.664624 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khrv4"
Sep 30 17:30:09 crc kubenswrapper[4772]: I0930 17:30:09.726588 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"]
Sep 30 17:30:10 crc kubenswrapper[4772]: I0930 17:30:10.639617 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khrv4" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="registry-server" containerID="cri-o://d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa" gracePeriod=2
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.135876 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khrv4"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.224993 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities\") pod \"c33a8a57-808c-4d51-b65e-54ba9977845b\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") "
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.226260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mxhg\" (UniqueName: \"kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg\") pod \"c33a8a57-808c-4d51-b65e-54ba9977845b\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") "
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.226499 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content\") pod \"c33a8a57-808c-4d51-b65e-54ba9977845b\" (UID: \"c33a8a57-808c-4d51-b65e-54ba9977845b\") "
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.226254 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities" (OuterVolumeSpecName: "utilities") pod "c33a8a57-808c-4d51-b65e-54ba9977845b" (UID: "c33a8a57-808c-4d51-b65e-54ba9977845b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.230139 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.232518 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg" (OuterVolumeSpecName: "kube-api-access-4mxhg") pod "c33a8a57-808c-4d51-b65e-54ba9977845b" (UID: "c33a8a57-808c-4d51-b65e-54ba9977845b"). InnerVolumeSpecName "kube-api-access-4mxhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
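[editor's note] "Killing container with a grace period ... gracePeriod=2" is the standard two-phase stop: ask the container to exit, wait up to the grace period, then force-kill. Note the ContainerDied for registry-server lands about a second later, inside the 2s window. A generic sketch of the pattern against a local child process (not the actual CRI RPC):

package main

import (
	"os"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, waits up to grace, then falls back to SIGKILL,
// mirroring the runtime's graceful-stop sequence.
func stopWithGrace(p *os.Process, grace time.Duration) {
	done := make(chan struct{})
	go func() { p.Wait(); close(done) }()
	p.Signal(syscall.SIGTERM) // polite request first
	select {
	case <-done: // exited within the grace period
	case <-time.After(grace):
		p.Kill() // hard stop, like the runtime's SIGKILL fallback
	}
}

func main() {
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	stopWithGrace(cmd.Process, 2*time.Second)
}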
InnerVolumeSpecName "kube-api-access-4mxhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.330609 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c33a8a57-808c-4d51-b65e-54ba9977845b" (UID: "c33a8a57-808c-4d51-b65e-54ba9977845b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.332653 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33a8a57-808c-4d51-b65e-54ba9977845b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.332698 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mxhg\" (UniqueName: \"kubernetes.io/projected/c33a8a57-808c-4d51-b65e-54ba9977845b-kube-api-access-4mxhg\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.652412 4772 generic.go:334] "Generic (PLEG): container finished" podID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerID="d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa" exitCode=0 Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.652498 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerDied","Data":"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"} Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.653125 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrv4" event={"ID":"c33a8a57-808c-4d51-b65e-54ba9977845b","Type":"ContainerDied","Data":"e730070928b4b915a1c19a3ad27ab64e5a97b927568f2ca709e28d89cdf8f7fa"} Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.652575 4772 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.653204 4772 scope.go:117] "RemoveContainer" containerID="d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.700838 4772 scope.go:117] "RemoveContainer" containerID="838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.703398 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"]
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.715453 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khrv4"]
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.736116 4772 scope.go:117] "RemoveContainer" containerID="fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.780563 4772 scope.go:117] "RemoveContainer" containerID="d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"
Sep 30 17:30:11 crc kubenswrapper[4772]: E0930 17:30:11.781373 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa\": container with ID starting with d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa not found: ID does not exist" containerID="d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.781684 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa"} err="failed to get container status \"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa\": rpc error: code = NotFound desc = could not find container \"d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa\": container with ID starting with d73d74e9575cdcbe70e0d116d6e75612de8a160bf688600cb1f71e2053250faa not found: ID does not exist"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.781774 4772 scope.go:117] "RemoveContainer" containerID="838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b"
Sep 30 17:30:11 crc kubenswrapper[4772]: E0930 17:30:11.782225 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b\": container with ID starting with 838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b not found: ID does not exist" containerID="838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.782338 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b"} err="failed to get container status \"838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b\": rpc error: code = NotFound desc = could not find container \"838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b\": container with ID starting with 838516ce5fe687de92bdd5950245acbbffe6aecb4365b6c5e2f7db062657e80b not found: ID does not exist"
Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.782433 4772 scope.go:117] "RemoveContainer" containerID="fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8"
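[editor's note] The RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" pairs here (and the matching pair for fac22b51... just below) are a benign race: by the time the status lookup runs, CRI-O has already discarded the container, so the deletor logs the error and moves on. The robust pattern is an idempotent delete that treats "not found" as success; a std-lib sketch with a stubbed runtime call (real CRI errors arrive as gRPC NotFound statuses):

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for a gRPC codes.NotFound status from the runtime.
var errNotFound = errors.New("container not found")

// removeContainer is a stub for the runtime call that can race with CRI-O's
// own cleanup of exited containers.
func removeContainer(id string) error {
	return fmt.Errorf("could not find container %q: %w", id, errNotFound)
}

// removeIdempotent treats "already gone" as success, which is why the kubelet
// logs the NotFound error above but still completes pod cleanup normally.
func removeIdempotent(id string) error {
	if err := removeContainer(id); err != nil && !errors.Is(err, errNotFound) {
		return err
	}
	return nil
}

func main() {
	fmt.Println(removeIdempotent("d73d74e9575c")) // <nil>: nothing left to do
}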
containerID="fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8" Sep 30 17:30:11 crc kubenswrapper[4772]: E0930 17:30:11.782772 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8\": container with ID starting with fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8 not found: ID does not exist" containerID="fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.782869 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8"} err="failed to get container status \"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8\": rpc error: code = NotFound desc = could not find container \"fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8\": container with ID starting with fac22b51d004dc5b66ad468c34800a65e67389a4f1c9de388a80bece643517c8 not found: ID does not exist" Sep 30 17:30:11 crc kubenswrapper[4772]: I0930 17:30:11.913427 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" path="/var/lib/kubelet/pods/c33a8a57-808c-4d51-b65e-54ba9977845b/volumes" Sep 30 17:30:14 crc kubenswrapper[4772]: I0930 17:30:14.044076 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bad6-account-create-ns8k7"] Sep 30 17:30:14 crc kubenswrapper[4772]: I0930 17:30:14.053964 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bad6-account-create-ns8k7"] Sep 30 17:30:15 crc kubenswrapper[4772]: I0930 17:30:15.910630 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabbe70a-9741-4424-ae33-ed237e64a54b" path="/var/lib/kubelet/pods/aabbe70a-9741-4424-ae33-ed237e64a54b/volumes" Sep 30 17:30:19 crc kubenswrapper[4772]: I0930 17:30:19.904987 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:30:19 crc kubenswrapper[4772]: E0930 17:30:19.907373 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:30:24 crc kubenswrapper[4772]: I0930 17:30:24.037735 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-3d9f-account-create-84lzl"] Sep 30 17:30:24 crc kubenswrapper[4772]: I0930 17:30:24.049450 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-3d9f-account-create-84lzl"] Sep 30 17:30:25 crc kubenswrapper[4772]: I0930 17:30:25.908359 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143dbcfd-e0db-419c-aa5e-65ad9174de1e" path="/var/lib/kubelet/pods/143dbcfd-e0db-419c-aa5e-65ad9174de1e/volumes" Sep 30 17:30:26 crc kubenswrapper[4772]: I0930 17:30:26.822251 4772 generic.go:334] "Generic (PLEG): container finished" podID="b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" containerID="9b6432f3ad15c8e9f97708735999655110f936e734697dcf5b94d4d6cb59a4ae" exitCode=0 Sep 30 17:30:26 crc kubenswrapper[4772]: I0930 
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.341908 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.457544 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle\") pod \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") "
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.458037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory\") pod \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") "
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.458518 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2b2\" (UniqueName: \"kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2\") pod \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") "
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.458612 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key\") pod \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\" (UID: \"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0\") "
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.465518 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2" (OuterVolumeSpecName: "kube-api-access-tl2b2") pod "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" (UID: "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0"). InnerVolumeSpecName "kube-api-access-tl2b2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.475338 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" (UID: "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.486659 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory" (OuterVolumeSpecName: "inventory") pod "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" (UID: "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.490746 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" (UID: "b29acfcb-d20b-4be6-a22b-e0e0bc5deae0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.560896 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2b2\" (UniqueName: \"kubernetes.io/projected/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-kube-api-access-tl2b2\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.560931 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.560940 4772 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.560949 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.851848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px" event={"ID":"b29acfcb-d20b-4be6-a22b-e0e0bc5deae0","Type":"ContainerDied","Data":"db40911785b3b4e5417e5ab50af1674b5ea17c09335dec409771e5c58426d517"} Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.851892 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db40911785b3b4e5417e5ab50af1674b5ea17c09335dec409771e5c58426d517" Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.851921 4772 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927016 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"]
Sep 30 17:30:28 crc kubenswrapper[4772]: E0930 17:30:28.927444 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="extract-content"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927462 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="extract-content"
Sep 30 17:30:28 crc kubenswrapper[4772]: E0930 17:30:28.927483 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="registry-server"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927490 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="registry-server"
Sep 30 17:30:28 crc kubenswrapper[4772]: E0930 17:30:28.927503 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c870583b-ddcc-4939-94ae-f192d0ed0f2b" containerName="collect-profiles"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927508 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c870583b-ddcc-4939-94ae-f192d0ed0f2b" containerName="collect-profiles"
Sep 30 17:30:28 crc kubenswrapper[4772]: E0930 17:30:28.927525 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="extract-utilities"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927531 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="extract-utilities"
Sep 30 17:30:28 crc kubenswrapper[4772]: E0930 17:30:28.927555 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927562 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927749 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927769 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c870583b-ddcc-4939-94ae-f192d0ed0f2b" containerName="collect-profiles"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.927784 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33a8a57-808c-4d51-b65e-54ba9977845b" containerName="registry-server"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.928813 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
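[editor's note] The cpu_manager/memory_manager "RemoveStaleState" burst fires on the next pod admission: resource-manager state is keyed by (pod UID, container name), and entries belonging to pods that no longer exist (the redhat-operators, collect-profiles, and bootstrap pods torn down above) are swept before the configure-network pod is admitted. A map-based sketch of that sweep; names and the value type are illustrative:

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments for any pod the kubelet no longer knows,
// mirroring the paired cpu_manager.go:410 / state_mem.go:107 lines above.
func removeStaleState(state map[key]string, activePods map[string]bool) {
	for k := range state {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %s/%s\n", k.podUID[:8], k.container)
			delete(state, k)
		}
	}
}

func main() {
	state := map[key]string{
		{"c33a8a57-808c-4d51-b65e-54ba9977845b", "registry-server"}: "cpuset 0-1",
		{"c870583b-ddcc-4939-94ae-f192d0ed0f2b", "collect-profiles"}: "cpuset 2",
	}
	removeStaleState(state, map[string]bool{}) // no active pods: both entries dropped
}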
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.935641 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.935919 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.936071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.936169 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9"
Sep 30 17:30:28 crc kubenswrapper[4772]: I0930 17:30:28.955013 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"]
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.080701 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pgw\" (UniqueName: \"kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.080903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.081016 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.183884 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.184129 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.184270 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pgw\" (UniqueName: \"kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"
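[editor's note] The reflector.go "Caches populated" lines show the kubelet warming a watch cache for each ConfigMap/Secret the new pod mounts, scoped to the "openstack" namespace. Outside the kubelet, the same machinery is exposed as client-go informers; a sketch that warms a Secret cache for one namespace (in-cluster config assumed; this is an analogy to the log lines, not the kubelet's exact code path):

package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(cfg)

	// Watch only the namespace of interest, as the kubelet scopes its
	// per-pod object caches to the pod's namespace.
	factory := informers.NewSharedInformerFactoryWithOptions(
		clientset, 10*time.Minute, informers.WithNamespace("openstack"))
	inf := factory.Core().V1().Secrets().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) { fmt.Println("cache populated for a Secret") },
	})

	stop := make(chan struct{})
	factory.Start(stop)
	factory.WaitForCacheSync(stop) // "Caches populated" corresponds to this sync
}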
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.189842 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.190486 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.203453 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pgw\" (UniqueName: \"kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bplkf\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.254905 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.782606 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"] Sep 30 17:30:29 crc kubenswrapper[4772]: I0930 17:30:29.863506 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" event={"ID":"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9","Type":"ContainerStarted","Data":"6a63ef02974abd4263fbdc701ce0a158664d1a55848bf39f7e113cd327d63c64"} Sep 30 17:30:31 crc kubenswrapper[4772]: I0930 17:30:31.885677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" event={"ID":"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9","Type":"ContainerStarted","Data":"2168037135d0ef970a9fc055a874188222affce2f41bb707b5d1eff228a56234"} Sep 30 17:30:31 crc kubenswrapper[4772]: I0930 17:30:31.902572 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" podStartSLOduration=2.736099152 podStartE2EDuration="3.902554996s" podCreationTimestamp="2025-09-30 17:30:28 +0000 UTC" firstStartedPulling="2025-09-30 17:30:29.798482279 +0000 UTC m=+1730.705495110" lastFinishedPulling="2025-09-30 17:30:30.964938123 +0000 UTC m=+1731.871950954" observedRunningTime="2025-09-30 17:30:31.901851177 +0000 UTC m=+1732.808864008" watchObservedRunningTime="2025-09-30 17:30:31.902554996 +0000 UTC m=+1732.809567827" Sep 30 17:30:32 crc kubenswrapper[4772]: I0930 17:30:32.077696 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-76fwc"] Sep 30 17:30:32 crc kubenswrapper[4772]: I0930 17:30:32.091634 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-create-76fwc"] Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.036767 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bpsgs"] Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.050819 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8jz82"] Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.059675 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8jz82"] Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.071852 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bpsgs"] Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.912444 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ffc8f04-1cf1-4653-925c-781c84770099" path="/var/lib/kubelet/pods/1ffc8f04-1cf1-4653-925c-781c84770099/volumes" Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.913091 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44" path="/var/lib/kubelet/pods/a6cb11fd-200b-44d7-a29f-f9ebbdbc2d44/volumes" Sep 30 17:30:33 crc kubenswrapper[4772]: I0930 17:30:33.913711 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d32d405b-efef-42de-a2c0-7c047dbcbec3" path="/var/lib/kubelet/pods/d32d405b-efef-42de-a2c0-7c047dbcbec3/volumes" Sep 30 17:30:34 crc kubenswrapper[4772]: I0930 17:30:34.899700 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:30:34 crc kubenswrapper[4772]: E0930 17:30:34.900120 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:30:38 crc kubenswrapper[4772]: I0930 17:30:38.039191 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tmd5b"] Sep 30 17:30:38 crc kubenswrapper[4772]: I0930 17:30:38.048036 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tmd5b"] Sep 30 17:30:39 crc kubenswrapper[4772]: I0930 17:30:39.914350 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8344d58c-29c7-40c3-81bb-5fc3ad4ea02b" path="/var/lib/kubelet/pods/8344d58c-29c7-40c3-81bb-5fc3ad4ea02b/volumes" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.100390 4772 scope.go:117] "RemoveContainer" containerID="3b7fa39dfb0ec84ac5f5eee196c7d64ca902556cb1cc336ffff2d5fd35a6ff10" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.153776 4772 scope.go:117] "RemoveContainer" containerID="997add3da0e114db31ad8dcdeb005e9fd720aeaa4e6e6d7381e2647af2972623" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.186592 4772 scope.go:117] "RemoveContainer" containerID="d4a1824ba74425c91844929323df348b7f9ff9219801ae6d6ff31d0f83f64888" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.238731 4772 scope.go:117] "RemoveContainer" containerID="78db73365fb6aaf5ca3d228cd401bb07e9acabc5e908def1b1d31f89ab2dec5c" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.290331 4772 scope.go:117] "RemoveContainer" 
containerID="b2c401b202da2439070bae59cba89ede435518cde20bf7c02161c203f3b055d0" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.347376 4772 scope.go:117] "RemoveContainer" containerID="91c2f294268261e241aac7972a7c3e2bf679dcd6facbec1710878bddc0a11327" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.385243 4772 scope.go:117] "RemoveContainer" containerID="b4ce15c683d6c40ea99bc219a4e2bf12056e181a564d45c78a204e89a72d873c" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.404003 4772 scope.go:117] "RemoveContainer" containerID="db5afbfcea9ec7c1286711ea88e82dad4d7a7f820076f7a86e2d90353a7fa099" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.430315 4772 scope.go:117] "RemoveContainer" containerID="85ffc3c53bfbef4f015e4511158ceb4aecc00e91d8d6288ff39c66ccab4fa0cb" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.469094 4772 scope.go:117] "RemoveContainer" containerID="4b05e0ddeedcc0fa91d63b33ca724dccd8da801a73e685d0e56791938aecbd32" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.488035 4772 scope.go:117] "RemoveContainer" containerID="8c859f941a1598f3fc91c4830640df9fcbc78c81830b9b5e7f3ac0bba7f846f1" Sep 30 17:30:42 crc kubenswrapper[4772]: I0930 17:30:42.513314 4772 scope.go:117] "RemoveContainer" containerID="1c70fcad8c30ec41a32bafc76e313a025ebb1b6e92442c198bafbbf0e5fd5559" Sep 30 17:30:44 crc kubenswrapper[4772]: I0930 17:30:44.046106 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-464a-account-create-r86r7"] Sep 30 17:30:44 crc kubenswrapper[4772]: I0930 17:30:44.054695 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a3ad-account-create-tv29q"] Sep 30 17:30:44 crc kubenswrapper[4772]: I0930 17:30:44.064907 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-464a-account-create-r86r7"] Sep 30 17:30:44 crc kubenswrapper[4772]: I0930 17:30:44.072254 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a3ad-account-create-tv29q"] Sep 30 17:30:45 crc kubenswrapper[4772]: I0930 17:30:45.917866 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6026a4-443b-44c1-8390-d1958c7b4f92" path="/var/lib/kubelet/pods/0c6026a4-443b-44c1-8390-d1958c7b4f92/volumes" Sep 30 17:30:45 crc kubenswrapper[4772]: I0930 17:30:45.921467 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3b76f5-2040-45f5-ae68-118f4399738b" path="/var/lib/kubelet/pods/2f3b76f5-2040-45f5-ae68-118f4399738b/volumes" Sep 30 17:30:47 crc kubenswrapper[4772]: I0930 17:30:47.033534 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-pf2kh"] Sep 30 17:30:47 crc kubenswrapper[4772]: I0930 17:30:47.049837 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-pf2kh"] Sep 30 17:30:47 crc kubenswrapper[4772]: I0930 17:30:47.903017 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:30:47 crc kubenswrapper[4772]: E0930 17:30:47.903278 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:30:47 crc kubenswrapper[4772]: I0930 
Sep 30 17:30:48 crc kubenswrapper[4772]: I0930 17:30:48.027697 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-567b-account-create-957g7"]
Sep 30 17:30:48 crc kubenswrapper[4772]: I0930 17:30:48.037732 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-567b-account-create-957g7"]
Sep 30 17:30:49 crc kubenswrapper[4772]: I0930 17:30:49.913335 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87675da0-8050-4d05-bc27-0c8e519a83c4" path="/var/lib/kubelet/pods/87675da0-8050-4d05-bc27-0c8e519a83c4/volumes"
Sep 30 17:30:58 crc kubenswrapper[4772]: I0930 17:30:58.036965 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-l59ml"]
Sep 30 17:30:58 crc kubenswrapper[4772]: I0930 17:30:58.048342 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-l59ml"]
Sep 30 17:30:59 crc kubenswrapper[4772]: I0930 17:30:59.898591 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"
Sep 30 17:30:59 crc kubenswrapper[4772]: E0930 17:30:59.899378 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:30:59 crc kubenswrapper[4772]: I0930 17:30:59.917154 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e531fe5c-574b-4894-b491-a46e9892d380" path="/var/lib/kubelet/pods/e531fe5c-574b-4894-b491-a46e9892d380/volumes"
Sep 30 17:31:12 crc kubenswrapper[4772]: I0930 17:31:12.044918 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pwm7r"]
Sep 30 17:31:12 crc kubenswrapper[4772]: I0930 17:31:12.051895 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pwm7r"]
Sep 30 17:31:13 crc kubenswrapper[4772]: I0930 17:31:13.908909 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523a44fe-7e63-47a7-9b9d-4e272994dce1" path="/var/lib/kubelet/pods/523a44fe-7e63-47a7-9b9d-4e272994dce1/volumes"
Sep 30 17:31:14 crc kubenswrapper[4772]: I0930 17:31:14.898244 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"
Sep 30 17:31:14 crc kubenswrapper[4772]: E0930 17:31:14.898964 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.034771 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-lvdj8"]
Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.045397 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xwzjp"]
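[editor's note] The steady drumbeat of SyncLoop REMOVE followed a tick later by "Cleaned up orphaned pod volumes dir" for the same UID is housekeeping: once a pod's API object is gone, the kubelet scans /var/lib/kubelet/pods/<uid>/volumes for directories whose UID no longer matches any known pod and removes them. A sketch of that scan; the directory layout is taken from the log paths, and the actual removal is left commented out:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphans lists per-pod directories and flags any whose UID is not in
// the set of pods the kubelet still knows about.
func cleanupOrphans(podsDir string, known map[string]bool) error {
	entries, err := os.ReadDir(podsDir)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() && !known[e.Name()] {
			vols := filepath.Join(podsDir, e.Name(), "volumes")
			fmt.Println("Cleaned up orphaned pod volumes dir:", vols)
			// os.RemoveAll(vols) would perform the actual cleanup.
		}
	}
	return nil
}

func main() {
	_ = cleanupOrphans("/var/lib/kubelet/pods", map[string]bool{})
}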
pods=["openstack/keystone-bootstrap-xwzjp"] Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.080259 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-lvdj8"] Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.092926 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xwzjp"] Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.912490 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c54c4a-b5a3-4234-8cdc-62d55390d7c9" path="/var/lib/kubelet/pods/23c54c4a-b5a3-4234-8cdc-62d55390d7c9/volumes" Sep 30 17:31:23 crc kubenswrapper[4772]: I0930 17:31:23.913390 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2" path="/var/lib/kubelet/pods/40abbd90-d2b5-4ce7-a783-7e1f0e59c7b2/volumes" Sep 30 17:31:26 crc kubenswrapper[4772]: I0930 17:31:26.898927 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:31:26 crc kubenswrapper[4772]: E0930 17:31:26.899573 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:31:35 crc kubenswrapper[4772]: I0930 17:31:35.037183 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wfrg6"] Sep 30 17:31:35 crc kubenswrapper[4772]: I0930 17:31:35.046423 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wfrg6"] Sep 30 17:31:35 crc kubenswrapper[4772]: I0930 17:31:35.908238 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e25cc4-9ac5-4e36-87b0-4523bba98b4b" path="/var/lib/kubelet/pods/93e25cc4-9ac5-4e36-87b0-4523bba98b4b/volumes" Sep 30 17:31:36 crc kubenswrapper[4772]: I0930 17:31:36.028330 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f85ns"] Sep 30 17:31:36 crc kubenswrapper[4772]: I0930 17:31:36.036564 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f85ns"] Sep 30 17:31:37 crc kubenswrapper[4772]: I0930 17:31:37.909075 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7633806b-c365-4597-b298-1e9767c640d4" path="/var/lib/kubelet/pods/7633806b-c365-4597-b298-1e9767c640d4/volumes" Sep 30 17:31:39 crc kubenswrapper[4772]: I0930 17:31:39.905040 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:31:39 crc kubenswrapper[4772]: E0930 17:31:39.905777 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:31:41 crc kubenswrapper[4772]: I0930 17:31:41.530967 4772 generic.go:334] "Generic (PLEG): container finished" podID="d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" 
containerID="2168037135d0ef970a9fc055a874188222affce2f41bb707b5d1eff228a56234" exitCode=0 Sep 30 17:31:41 crc kubenswrapper[4772]: I0930 17:31:41.531100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" event={"ID":"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9","Type":"ContainerDied","Data":"2168037135d0ef970a9fc055a874188222affce2f41bb707b5d1eff228a56234"} Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.729690 4772 scope.go:117] "RemoveContainer" containerID="901f6614fabc8f4551b399a76786ff27bf1ae43ea4b295635749f22328a69dd0" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.778451 4772 scope.go:117] "RemoveContainer" containerID="999b8f64a3c19b21dc80511ac25737d3a9d403b481e77c769b6d864026daa8c9" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.817800 4772 scope.go:117] "RemoveContainer" containerID="105254529cd8396980d60915c8aa90f7066b2c99b6235d19bead5afcd5b5e448" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.883239 4772 scope.go:117] "RemoveContainer" containerID="94eb884ebb25f57aedf87f75893a3b9cb1671eb96a2718f064598d23010344b9" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.959433 4772 scope.go:117] "RemoveContainer" containerID="bf446099bc0634fd206503e4ecbd53eac5f303a6d83ddda96c06c4b8ae0b2974" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.997734 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.999377 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory\") pod \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.999404 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key\") pod \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " Sep 30 17:31:42 crc kubenswrapper[4772]: I0930 17:31:42.999559 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2pgw\" (UniqueName: \"kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw\") pod \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\" (UID: \"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9\") " Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.005254 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw" (OuterVolumeSpecName: "kube-api-access-g2pgw") pod "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" (UID: "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9"). InnerVolumeSpecName "kube-api-access-g2pgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.028149 4772 scope.go:117] "RemoveContainer" containerID="81995ac3da0930e34bed27a2b61146b6f1cf93a88c5de50e3f2249108d0cf3d5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.031403 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" (UID: "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.040441 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory" (OuterVolumeSpecName: "inventory") pod "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" (UID: "d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.051166 4772 scope.go:117] "RemoveContainer" containerID="9c601db0ca9f168742c47a0a599aac8b2dff2aa4a71a70b9f1dfe67a7de29867" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.087615 4772 scope.go:117] "RemoveContainer" containerID="4e48e248cc6d67699c0134651aa74d5261771500766df0325c489e76ce9306ef" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.106352 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2pgw\" (UniqueName: \"kubernetes.io/projected/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-kube-api-access-g2pgw\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.106381 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.106391 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.126277 4772 scope.go:117] "RemoveContainer" containerID="497209c744bd058efc3c7294be6072f96bcb866cda13531d017880cd7e07b1e9" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.152474 4772 scope.go:117] "RemoveContainer" containerID="e603d5d35581553f301c390d087917b97aff634e438c61d654975d7104445ff7" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.548870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" event={"ID":"d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9","Type":"ContainerDied","Data":"6a63ef02974abd4263fbdc701ce0a158664d1a55848bf39f7e113cd327d63c64"} Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.549104 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.550409 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a63ef02974abd4263fbdc701ce0a158664d1a55848bf39f7e113cd327d63c64" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.629090 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5"] Sep 30 17:31:43 crc kubenswrapper[4772]: E0930 17:31:43.629574 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.629593 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.629824 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.630727 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.634689 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.634958 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.635346 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.635346 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.639863 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5"] Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.715652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.715723 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.715762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4h6v\" (UniqueName: \"kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.818051 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.818180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.818239 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4h6v\" (UniqueName: \"kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.822310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.823590 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.835318 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4h6v\" (UniqueName: \"kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t58j5\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:43 crc kubenswrapper[4772]: I0930 17:31:43.953169 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:44 crc kubenswrapper[4772]: I0930 17:31:44.517574 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5"] Sep 30 17:31:44 crc kubenswrapper[4772]: I0930 17:31:44.564511 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" event={"ID":"a109e03b-45b5-4c40-91d5-d9719de7cce8","Type":"ContainerStarted","Data":"267f1ac22136dd79fae49c2cfa09ff9e6be7fe4d1c09f8282bea3d605eee3bf2"} Sep 30 17:31:45 crc kubenswrapper[4772]: I0930 17:31:45.576838 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" event={"ID":"a109e03b-45b5-4c40-91d5-d9719de7cce8","Type":"ContainerStarted","Data":"9bf5673843f926b2c944482cb626a0dce054fa127abd39c639b4cba04fcab37b"} Sep 30 17:31:45 crc kubenswrapper[4772]: I0930 17:31:45.595371 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" podStartSLOduration=1.9817531480000001 podStartE2EDuration="2.595348758s" podCreationTimestamp="2025-09-30 17:31:43 +0000 UTC" firstStartedPulling="2025-09-30 17:31:44.525826481 +0000 UTC m=+1805.432839312" lastFinishedPulling="2025-09-30 17:31:45.139422091 +0000 UTC m=+1806.046434922" observedRunningTime="2025-09-30 17:31:45.59426813 +0000 UTC m=+1806.501280991" watchObservedRunningTime="2025-09-30 17:31:45.595348758 +0000 UTC m=+1806.502361609" Sep 30 17:31:50 crc kubenswrapper[4772]: I0930 17:31:50.621631 4772 generic.go:334] "Generic (PLEG): container finished" podID="a109e03b-45b5-4c40-91d5-d9719de7cce8" containerID="9bf5673843f926b2c944482cb626a0dce054fa127abd39c639b4cba04fcab37b" exitCode=0 Sep 30 17:31:50 crc kubenswrapper[4772]: I0930 17:31:50.621714 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" event={"ID":"a109e03b-45b5-4c40-91d5-d9719de7cce8","Type":"ContainerDied","Data":"9bf5673843f926b2c944482cb626a0dce054fa127abd39c639b4cba04fcab37b"} Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.059204 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.188985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory\") pod \"a109e03b-45b5-4c40-91d5-d9719de7cce8\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.189219 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4h6v\" (UniqueName: \"kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v\") pod \"a109e03b-45b5-4c40-91d5-d9719de7cce8\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.189362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key\") pod \"a109e03b-45b5-4c40-91d5-d9719de7cce8\" (UID: \"a109e03b-45b5-4c40-91d5-d9719de7cce8\") " Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.194545 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v" (OuterVolumeSpecName: "kube-api-access-m4h6v") pod "a109e03b-45b5-4c40-91d5-d9719de7cce8" (UID: "a109e03b-45b5-4c40-91d5-d9719de7cce8"). InnerVolumeSpecName "kube-api-access-m4h6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.217805 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a109e03b-45b5-4c40-91d5-d9719de7cce8" (UID: "a109e03b-45b5-4c40-91d5-d9719de7cce8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.220808 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory" (OuterVolumeSpecName: "inventory") pod "a109e03b-45b5-4c40-91d5-d9719de7cce8" (UID: "a109e03b-45b5-4c40-91d5-d9719de7cce8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.292484 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.292871 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a109e03b-45b5-4c40-91d5-d9719de7cce8-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.292885 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4h6v\" (UniqueName: \"kubernetes.io/projected/a109e03b-45b5-4c40-91d5-d9719de7cce8-kube-api-access-m4h6v\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.642955 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" event={"ID":"a109e03b-45b5-4c40-91d5-d9719de7cce8","Type":"ContainerDied","Data":"267f1ac22136dd79fae49c2cfa09ff9e6be7fe4d1c09f8282bea3d605eee3bf2"} Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.642995 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="267f1ac22136dd79fae49c2cfa09ff9e6be7fe4d1c09f8282bea3d605eee3bf2" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.643014 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.728519 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc"] Sep 30 17:31:52 crc kubenswrapper[4772]: E0930 17:31:52.728929 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a109e03b-45b5-4c40-91d5-d9719de7cce8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.728946 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a109e03b-45b5-4c40-91d5-d9719de7cce8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.729157 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a109e03b-45b5-4c40-91d5-d9719de7cce8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.734932 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.737710 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.737762 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.737969 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.741174 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc"] Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.742796 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.904914 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.905104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:52 crc kubenswrapper[4772]: I0930 17:31:52.905181 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8z7\" (UniqueName: \"kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.010024 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8z7\" (UniqueName: \"kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.010700 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.010768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: 
\"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.015139 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.018587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.033907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8z7\" (UniqueName: \"kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh8pc\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.058169 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.582938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc"] Sep 30 17:31:53 crc kubenswrapper[4772]: I0930 17:31:53.651617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" event={"ID":"1a025bcd-8420-43af-b3ef-f6c3b1c5941d","Type":"ContainerStarted","Data":"c6033fd6589b463ede1fd8307905bb13df2755422dd054178b223c90b7bbbf87"} Sep 30 17:31:54 crc kubenswrapper[4772]: I0930 17:31:54.662497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" event={"ID":"1a025bcd-8420-43af-b3ef-f6c3b1c5941d","Type":"ContainerStarted","Data":"965d300a81ec275fec1c1370c9a23947799ce1918e980807fff6123cfb2476fb"} Sep 30 17:31:54 crc kubenswrapper[4772]: I0930 17:31:54.719480 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" podStartSLOduration=2.112689659 podStartE2EDuration="2.719462616s" podCreationTimestamp="2025-09-30 17:31:52 +0000 UTC" firstStartedPulling="2025-09-30 17:31:53.589890467 +0000 UTC m=+1814.496903298" lastFinishedPulling="2025-09-30 17:31:54.196663424 +0000 UTC m=+1815.103676255" observedRunningTime="2025-09-30 17:31:54.714347442 +0000 UTC m=+1815.621360273" watchObservedRunningTime="2025-09-30 17:31:54.719462616 +0000 UTC m=+1815.626475447" Sep 30 17:31:54 crc kubenswrapper[4772]: I0930 17:31:54.898353 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:31:54 crc kubenswrapper[4772]: E0930 17:31:54.898676 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:32:07 crc kubenswrapper[4772]: I0930 17:32:07.899358 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:32:07 crc kubenswrapper[4772]: E0930 17:32:07.900728 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:32:22 crc kubenswrapper[4772]: I0930 17:32:22.898470 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:32:22 crc kubenswrapper[4772]: E0930 17:32:22.899498 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:32:34 crc kubenswrapper[4772]: I0930 17:32:34.045482 4772 generic.go:334] "Generic (PLEG): container finished" podID="1a025bcd-8420-43af-b3ef-f6c3b1c5941d" containerID="965d300a81ec275fec1c1370c9a23947799ce1918e980807fff6123cfb2476fb" exitCode=0 Sep 30 17:32:34 crc kubenswrapper[4772]: I0930 17:32:34.045576 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" event={"ID":"1a025bcd-8420-43af-b3ef-f6c3b1c5941d","Type":"ContainerDied","Data":"965d300a81ec275fec1c1370c9a23947799ce1918e980807fff6123cfb2476fb"} Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.481310 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.600595 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory\") pod \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.600707 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key\") pod \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.600738 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg8z7\" (UniqueName: \"kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7\") pod \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\" (UID: \"1a025bcd-8420-43af-b3ef-f6c3b1c5941d\") " Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.608644 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7" (OuterVolumeSpecName: "kube-api-access-wg8z7") pod "1a025bcd-8420-43af-b3ef-f6c3b1c5941d" (UID: "1a025bcd-8420-43af-b3ef-f6c3b1c5941d"). InnerVolumeSpecName "kube-api-access-wg8z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.629563 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory" (OuterVolumeSpecName: "inventory") pod "1a025bcd-8420-43af-b3ef-f6c3b1c5941d" (UID: "1a025bcd-8420-43af-b3ef-f6c3b1c5941d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.637457 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a025bcd-8420-43af-b3ef-f6c3b1c5941d" (UID: "1a025bcd-8420-43af-b3ef-f6c3b1c5941d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.703273 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.703315 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg8z7\" (UniqueName: \"kubernetes.io/projected/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-kube-api-access-wg8z7\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.703336 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a025bcd-8420-43af-b3ef-f6c3b1c5941d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:35 crc kubenswrapper[4772]: I0930 17:32:35.899997 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:32:35 crc kubenswrapper[4772]: E0930 17:32:35.900274 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.063835 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" event={"ID":"1a025bcd-8420-43af-b3ef-f6c3b1c5941d","Type":"ContainerDied","Data":"c6033fd6589b463ede1fd8307905bb13df2755422dd054178b223c90b7bbbf87"} Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.063876 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6033fd6589b463ede1fd8307905bb13df2755422dd054178b223c90b7bbbf87" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.063888 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.146229 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl"] Sep 30 17:32:36 crc kubenswrapper[4772]: E0930 17:32:36.146845 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a025bcd-8420-43af-b3ef-f6c3b1c5941d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.146909 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a025bcd-8420-43af-b3ef-f6c3b1c5941d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.147301 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a025bcd-8420-43af-b3ef-f6c3b1c5941d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.148036 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.151086 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.151199 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.151853 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.152047 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.160528 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl"] Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.323531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.323591 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.323650 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmdp\" (UniqueName: \"kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.426033 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clmdp\" (UniqueName: \"kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.426231 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.426274 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" 
(UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.430658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.431859 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.445008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmdp\" (UniqueName: \"kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:36 crc kubenswrapper[4772]: I0930 17:32:36.465411 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:37 crc kubenswrapper[4772]: I0930 17:32:37.001184 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl"] Sep 30 17:32:37 crc kubenswrapper[4772]: I0930 17:32:37.074464 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" event={"ID":"59fe2017-52c6-4086-ac96-73f822eb744d","Type":"ContainerStarted","Data":"e38473676dd8a88c30c2c2ee12d01bd876db5e1b423dcb8be293a831f1a39a99"} Sep 30 17:32:39 crc kubenswrapper[4772]: I0930 17:32:39.126534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" event={"ID":"59fe2017-52c6-4086-ac96-73f822eb744d","Type":"ContainerStarted","Data":"35ddad75ce72179dea54caffc34bc1b944e074b011f24e19e3d0101078893161"} Sep 30 17:32:39 crc kubenswrapper[4772]: I0930 17:32:39.156455 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" podStartSLOduration=2.3479430900000002 podStartE2EDuration="3.156433522s" podCreationTimestamp="2025-09-30 17:32:36 +0000 UTC" firstStartedPulling="2025-09-30 17:32:37.009384087 +0000 UTC m=+1857.916396908" lastFinishedPulling="2025-09-30 17:32:37.817874509 +0000 UTC m=+1858.724887340" observedRunningTime="2025-09-30 17:32:39.149002578 +0000 UTC m=+1860.056015409" watchObservedRunningTime="2025-09-30 17:32:39.156433522 +0000 UTC m=+1860.063446353" Sep 30 17:32:40 crc kubenswrapper[4772]: I0930 17:32:40.038157 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vqdkk"] Sep 30 17:32:40 crc kubenswrapper[4772]: I0930 17:32:40.046157 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bnzg4"] Sep 30 17:32:40 crc kubenswrapper[4772]: I0930 17:32:40.053446 4772 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vqdkk"] Sep 30 17:32:40 crc kubenswrapper[4772]: I0930 17:32:40.060597 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bnzg4"] Sep 30 17:32:41 crc kubenswrapper[4772]: I0930 17:32:41.039746 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-r8bf5"] Sep 30 17:32:41 crc kubenswrapper[4772]: I0930 17:32:41.051737 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-r8bf5"] Sep 30 17:32:41 crc kubenswrapper[4772]: I0930 17:32:41.909392 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4018ba50-6562-42a4-ba6a-70d499df4c43" path="/var/lib/kubelet/pods/4018ba50-6562-42a4-ba6a-70d499df4c43/volumes" Sep 30 17:32:41 crc kubenswrapper[4772]: I0930 17:32:41.910277 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ffc0b4-4f98-45c5-b395-ee9defa7f57d" path="/var/lib/kubelet/pods/82ffc0b4-4f98-45c5-b395-ee9defa7f57d/volumes" Sep 30 17:32:41 crc kubenswrapper[4772]: I0930 17:32:41.910784 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65e356d-bb3c-4d60-b7f1-9b29c648351e" path="/var/lib/kubelet/pods/b65e356d-bb3c-4d60-b7f1-9b29c648351e/volumes" Sep 30 17:32:43 crc kubenswrapper[4772]: I0930 17:32:43.254455 4772 generic.go:334] "Generic (PLEG): container finished" podID="59fe2017-52c6-4086-ac96-73f822eb744d" containerID="35ddad75ce72179dea54caffc34bc1b944e074b011f24e19e3d0101078893161" exitCode=0 Sep 30 17:32:43 crc kubenswrapper[4772]: I0930 17:32:43.254778 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" event={"ID":"59fe2017-52c6-4086-ac96-73f822eb744d","Type":"ContainerDied","Data":"35ddad75ce72179dea54caffc34bc1b944e074b011f24e19e3d0101078893161"} Sep 30 17:32:43 crc kubenswrapper[4772]: I0930 17:32:43.315730 4772 scope.go:117] "RemoveContainer" containerID="898b1f9cdb8d3d1669bcfb246cd68e332f98c52278b79a3263da98a9c1ea36c5" Sep 30 17:32:43 crc kubenswrapper[4772]: I0930 17:32:43.342385 4772 scope.go:117] "RemoveContainer" containerID="e0bf65f0c7f3dfa3bd2d41991afa2b8dc55b2ce58fa62e7a966c8b3e6a359764" Sep 30 17:32:43 crc kubenswrapper[4772]: I0930 17:32:43.389437 4772 scope.go:117] "RemoveContainer" containerID="11be6602a2c00ad08b0a3d8ca8a9c49225fb5060588b17cc294ecb1c8af35dea" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.675984 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.796561 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory\") pod \"59fe2017-52c6-4086-ac96-73f822eb744d\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.796610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key\") pod \"59fe2017-52c6-4086-ac96-73f822eb744d\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.796801 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clmdp\" (UniqueName: \"kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp\") pod \"59fe2017-52c6-4086-ac96-73f822eb744d\" (UID: \"59fe2017-52c6-4086-ac96-73f822eb744d\") " Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.804363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp" (OuterVolumeSpecName: "kube-api-access-clmdp") pod "59fe2017-52c6-4086-ac96-73f822eb744d" (UID: "59fe2017-52c6-4086-ac96-73f822eb744d"). InnerVolumeSpecName "kube-api-access-clmdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.825132 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory" (OuterVolumeSpecName: "inventory") pod "59fe2017-52c6-4086-ac96-73f822eb744d" (UID: "59fe2017-52c6-4086-ac96-73f822eb744d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.825471 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59fe2017-52c6-4086-ac96-73f822eb744d" (UID: "59fe2017-52c6-4086-ac96-73f822eb744d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.899102 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.899137 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59fe2017-52c6-4086-ac96-73f822eb744d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:44 crc kubenswrapper[4772]: I0930 17:32:44.899152 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clmdp\" (UniqueName: \"kubernetes.io/projected/59fe2017-52c6-4086-ac96-73f822eb744d-kube-api-access-clmdp\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.272383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" event={"ID":"59fe2017-52c6-4086-ac96-73f822eb744d","Type":"ContainerDied","Data":"e38473676dd8a88c30c2c2ee12d01bd876db5e1b423dcb8be293a831f1a39a99"} Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.272421 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38473676dd8a88c30c2c2ee12d01bd876db5e1b423dcb8be293a831f1a39a99" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.272474 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.339957 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l"] Sep 30 17:32:45 crc kubenswrapper[4772]: E0930 17:32:45.340350 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fe2017-52c6-4086-ac96-73f822eb744d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.340368 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fe2017-52c6-4086-ac96-73f822eb744d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.340535 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="59fe2017-52c6-4086-ac96-73f822eb744d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.341188 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.350303 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.350431 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.350556 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.350441 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.352001 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l"] Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.408229 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.408708 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.408732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcqw8\" (UniqueName: \"kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.510829 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.510981 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.511006 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcqw8\" (UniqueName: \"kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" 
(UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.516000 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.518733 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.528220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcqw8\" (UniqueName: \"kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-89s5l\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:45 crc kubenswrapper[4772]: I0930 17:32:45.682852 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:32:46 crc kubenswrapper[4772]: I0930 17:32:46.224316 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l"] Sep 30 17:32:46 crc kubenswrapper[4772]: I0930 17:32:46.288932 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" event={"ID":"527cfb73-f3fa-4746-8174-788942e65624","Type":"ContainerStarted","Data":"1ee20b29554f00cdefeae6afff11414c443c2bc471f08ab3f0657cf547c6764e"} Sep 30 17:32:47 crc kubenswrapper[4772]: I0930 17:32:47.299451 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" event={"ID":"527cfb73-f3fa-4746-8174-788942e65624","Type":"ContainerStarted","Data":"2cc52942941c96f043717c0024add281d310abf2fab02cb7365b1a4a2029d083"} Sep 30 17:32:47 crc kubenswrapper[4772]: I0930 17:32:47.321263 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" podStartSLOduration=1.7986100889999999 podStartE2EDuration="2.321244106s" podCreationTimestamp="2025-09-30 17:32:45 +0000 UTC" firstStartedPulling="2025-09-30 17:32:46.222876013 +0000 UTC m=+1867.129888844" lastFinishedPulling="2025-09-30 17:32:46.74551003 +0000 UTC m=+1867.652522861" observedRunningTime="2025-09-30 17:32:47.313244417 +0000 UTC m=+1868.220257248" watchObservedRunningTime="2025-09-30 17:32:47.321244106 +0000 UTC m=+1868.228256937" Sep 30 17:32:47 crc kubenswrapper[4772]: I0930 17:32:47.898955 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:32:47 crc kubenswrapper[4772]: E0930 17:32:47.899281 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
Sep 30 17:32:50 crc kubenswrapper[4772]: I0930 17:32:50.029815 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-066d-account-create-7ntjb"]
Sep 30 17:32:50 crc kubenswrapper[4772]: I0930 17:32:50.043102 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-066d-account-create-7ntjb"]
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.045865 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-adee-account-create-vcvm9"]
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.061761 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-adee-account-create-vcvm9"]
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.071029 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1dab-account-create-h2p9r"]
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.081678 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1dab-account-create-h2p9r"]
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.908723 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4940e5ae-c8a0-497e-884c-32b360630a9a" path="/var/lib/kubelet/pods/4940e5ae-c8a0-497e-884c-32b360630a9a/volumes"
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.909411 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565231b6-3cd5-4d24-bb02-114c04ef14f6" path="/var/lib/kubelet/pods/565231b6-3cd5-4d24-bb02-114c04ef14f6/volumes"
Sep 30 17:32:51 crc kubenswrapper[4772]: I0930 17:32:51.909901 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8412255b-f9d0-4df8-a46a-b2e5f929322e" path="/var/lib/kubelet/pods/8412255b-f9d0-4df8-a46a-b2e5f929322e/volumes"
Sep 30 17:33:00 crc kubenswrapper[4772]: I0930 17:33:00.898631 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"
Sep 30 17:33:00 crc kubenswrapper[4772]: E0930 17:33:00.899532 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:33:11 crc kubenswrapper[4772]: I0930 17:33:11.898781 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"
Sep 30 17:33:11 crc kubenswrapper[4772]: E0930 17:33:11.899591 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:33:26 crc kubenswrapper[4772]: I0930 17:33:26.898992 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef"
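
The recurring pairs above ("RemoveContainer" followed by "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s") show the kubelet refusing to restart machine-config-daemon until its crash-loop backoff window expires; the delay doubles per failed restart up to a cap, which here has reached the 5m0s maximum. A minimal sketch of that doubling-with-cap policy (illustrative, not kubelet's actual code; the 10s initial delay is kubelet's documented default):

    // Sketch of a crash-loop backoff like the one being enforced above.
    package main

    import (
        "fmt"
        "time"
    )

    func backoff(restarts int) time.Duration {
        d := 10 * time.Second // assumed initial delay (kubelet default)
        for i := 0; i < restarts; i++ {
            d *= 2
            if d >= 5*time.Minute {
                return 5 * time.Minute // the cap seen in the log: "back-off 5m0s"
            }
        }
        return d
    }

    func main() {
        for r := 0; r <= 6; r++ {
            fmt.Printf("restart %d -> wait %v\n", r, backoff(r))
        }
    }
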
containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:33:26 crc kubenswrapper[4772]: E0930 17:33:26.900009 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:33:40 crc kubenswrapper[4772]: I0930 17:33:40.898228 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:33:41 crc kubenswrapper[4772]: I0930 17:33:41.798640 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76"} Sep 30 17:33:43 crc kubenswrapper[4772]: I0930 17:33:43.507240 4772 scope.go:117] "RemoveContainer" containerID="d95e8415dcd8b91a41f27ba874da5277dab1216884ebc147e2dc2d3d2ba245ce" Sep 30 17:33:43 crc kubenswrapper[4772]: I0930 17:33:43.530016 4772 scope.go:117] "RemoveContainer" containerID="c475f9da79ee17b14ab01114b90aa45593cd79f2b039b5ab7ccbef79e13d3837" Sep 30 17:33:43 crc kubenswrapper[4772]: I0930 17:33:43.577825 4772 scope.go:117] "RemoveContainer" containerID="24bcff46048882e0c870963a4248e25fcbedec9f5c29aaae4c662cd99ff2c760" Sep 30 17:33:43 crc kubenswrapper[4772]: I0930 17:33:43.815926 4772 generic.go:334] "Generic (PLEG): container finished" podID="527cfb73-f3fa-4746-8174-788942e65624" containerID="2cc52942941c96f043717c0024add281d310abf2fab02cb7365b1a4a2029d083" exitCode=2 Sep 30 17:33:43 crc kubenswrapper[4772]: I0930 17:33:43.815994 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" event={"ID":"527cfb73-f3fa-4746-8174-788942e65624","Type":"ContainerDied","Data":"2cc52942941c96f043717c0024add281d310abf2fab02cb7365b1a4a2029d083"} Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.259385 4772 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.425245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key\") pod \"527cfb73-f3fa-4746-8174-788942e65624\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") "
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.425678 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcqw8\" (UniqueName: \"kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8\") pod \"527cfb73-f3fa-4746-8174-788942e65624\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") "
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.425711 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory\") pod \"527cfb73-f3fa-4746-8174-788942e65624\" (UID: \"527cfb73-f3fa-4746-8174-788942e65624\") "
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.432968 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8" (OuterVolumeSpecName: "kube-api-access-kcqw8") pod "527cfb73-f3fa-4746-8174-788942e65624" (UID: "527cfb73-f3fa-4746-8174-788942e65624"). InnerVolumeSpecName "kube-api-access-kcqw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.457075 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory" (OuterVolumeSpecName: "inventory") pod "527cfb73-f3fa-4746-8174-788942e65624" (UID: "527cfb73-f3fa-4746-8174-788942e65624"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.457483 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "527cfb73-f3fa-4746-8174-788942e65624" (UID: "527cfb73-f3fa-4746-8174-788942e65624"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
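
Teardown mirrors setup: once the containers are gone, the reconciler starts an unmount for each volume (reconciler_common.go:159), the plugin's TearDown removes the mount (operation_generator.go:803), and only then is the volume reported detached from node "crc" (reconciler_common.go:293, just below). A minimal sketch of that ordering, illustrative only:

    // Sketch of the three-step teardown ordering visible above and below.
    package main

    import "fmt"

    func teardown(volumes []string, podUID string) {
        for _, v := range volumes {
            fmt.Printf("UnmountVolume started for volume %q pod %q\n", v, podUID)
        }
        for _, v := range volumes {
            // TearDown must succeed before the volume may be reported detached.
            fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v)
            fmt.Printf("Volume detached for volume %q on node \"crc\"\n", v)
        }
    }

    func main() {
        teardown([]string{"ssh-key", "kube-api-access-kcqw8", "inventory"}, "527cfb73-f3fa-4746-8174-788942e65624")
    }
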
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.529859 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.529963 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcqw8\" (UniqueName: \"kubernetes.io/projected/527cfb73-f3fa-4746-8174-788942e65624-kube-api-access-kcqw8\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.529994 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527cfb73-f3fa-4746-8174-788942e65624-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.836841 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" event={"ID":"527cfb73-f3fa-4746-8174-788942e65624","Type":"ContainerDied","Data":"1ee20b29554f00cdefeae6afff11414c443c2bc471f08ab3f0657cf547c6764e"} Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.836888 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee20b29554f00cdefeae6afff11414c443c2bc471f08ab3f0657cf547c6764e" Sep 30 17:33:45 crc kubenswrapper[4772]: I0930 17:33:45.836992 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l" Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.030477 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"] Sep 30 17:33:53 crc kubenswrapper[4772]: E0930 17:33:53.031673 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527cfb73-f3fa-4746-8174-788942e65624" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.031688 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="527cfb73-f3fa-4746-8174-788942e65624" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.031892 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="527cfb73-f3fa-4746-8174-788942e65624" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.032564 4772 util.go:30] "No sandbox for pod can be found. 
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.034881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.035383 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.035442 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.035664 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.046041 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"]
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.199872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npbwz\" (UniqueName: \"kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.199939 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.199991 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.302250 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.302396 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npbwz\" (UniqueName: \"kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.302437 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.308489 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.308514 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.320364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npbwz\" (UniqueName: \"kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-56tf9\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.421318 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:33:53 crc kubenswrapper[4772]: I0930 17:33:53.976746 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"]
Sep 30 17:33:54 crc kubenswrapper[4772]: I0930 17:33:54.920834 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9" event={"ID":"fecfdbbc-226d-4daf-b162-43aeefc9d100","Type":"ContainerStarted","Data":"02a56a314fbacc630045857f1f3bc8fefc2d3ae54e6874c6cf9603efccd6e32f"}
Sep 30 17:33:54 crc kubenswrapper[4772]: I0930 17:33:54.921294 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9" event={"ID":"fecfdbbc-226d-4daf-b162-43aeefc9d100","Type":"ContainerStarted","Data":"6a9acec2102c07c1453f1a0cf7ab5f393375554c94ab730e7d4340137feb7fd0"}
Sep 30 17:33:55 crc kubenswrapper[4772]: I0930 17:33:55.967843 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9" podStartSLOduration=2.349235042 podStartE2EDuration="2.967824849s" podCreationTimestamp="2025-09-30 17:33:53 +0000 UTC" firstStartedPulling="2025-09-30 17:33:53.979343479 +0000 UTC m=+1934.886356310" lastFinishedPulling="2025-09-30 17:33:54.597933266 +0000 UTC m=+1935.504946117" observedRunningTime="2025-09-30 17:33:55.960839857 +0000 UTC m=+1936.867852688" watchObservedRunningTime="2025-09-30 17:33:55.967824849 +0000 UTC m=+1936.874837680"
Sep 30 17:34:19 crc kubenswrapper[4772]: I0930 17:34:19.065106 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grhrs"]
Sep 30 17:34:19 crc kubenswrapper[4772]: I0930 17:34:19.072179 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grhrs"]
Sep 30 17:34:19 crc kubenswrapper[4772]: I0930 17:34:19.910315 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7" path="/var/lib/kubelet/pods/9c1e3468-2d4b-40a3-b9f4-fac58b13b5a7/volumes"
Sep 30 17:34:42 crc kubenswrapper[4772]: I0930 17:34:42.046908 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnbhb"]
Sep 30 17:34:42 crc kubenswrapper[4772]: I0930 17:34:42.063224 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnbhb"]
Sep 30 17:34:43 crc kubenswrapper[4772]: I0930 17:34:43.681476 4772 scope.go:117] "RemoveContainer" containerID="79b64bea6aaa5c47f6b2caac074731cf66c3ed741930fc4e98fa8683a63f6bae"
Sep 30 17:34:43 crc kubenswrapper[4772]: I0930 17:34:43.909692 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333bd9e9-4bac-49af-9d96-25c2c03cb96a" path="/var/lib/kubelet/pods/333bd9e9-4bac-49af-9d96-25c2c03cb96a/volumes"
Sep 30 17:34:44 crc kubenswrapper[4772]: I0930 17:34:44.030097 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qklz"]
Sep 30 17:34:44 crc kubenswrapper[4772]: I0930 17:34:44.040850 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qklz"]
Sep 30 17:34:45 crc kubenswrapper[4772]: I0930 17:34:45.417739 4772 generic.go:334] "Generic (PLEG): container finished" podID="fecfdbbc-226d-4daf-b162-43aeefc9d100" containerID="02a56a314fbacc630045857f1f3bc8fefc2d3ae54e6874c6cf9603efccd6e32f" exitCode=0
Sep 30 17:34:45 crc kubenswrapper[4772]: I0930 17:34:45.417859 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9" event={"ID":"fecfdbbc-226d-4daf-b162-43aeefc9d100","Type":"ContainerDied","Data":"02a56a314fbacc630045857f1f3bc8fefc2d3ae54e6874c6cf9603efccd6e32f"}
Sep 30 17:34:45 crc kubenswrapper[4772]: I0930 17:34:45.912868 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7d470d-5fe9-4d90-a24b-705f8af5d35d" path="/var/lib/kubelet/pods/7f7d470d-5fe9-4d90-a24b-705f8af5d35d/volumes"
Sep 30 17:34:46 crc kubenswrapper[4772]: I0930 17:34:46.873310 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
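
The kubelet_volumes.go:163 entries above are the periodic housekeeping pass that deletes /var/lib/kubelet/pods/<uid>/volumes for pods the API server no longer knows about, once their volumes are unmounted. A sketch of that scan, illustrative only (the real kubelet also verifies the volumes are fully torn down before deleting):

    // Sketch of the "Cleaned up orphaned pod volumes dir" housekeeping above.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func cleanupOrphans(podsDir string, active map[string]bool) error {
        entries, err := os.ReadDir(podsDir)
        if err != nil {
            return err
        }
        for _, e := range entries {
            if !e.IsDir() || active[e.Name()] {
                continue
            }
            volumes := filepath.Join(podsDir, e.Name(), "volumes")
            if _, err := os.Stat(volumes); err == nil {
                fmt.Printf("Cleaned up orphaned pod volumes dir podUID=%q path=%q\n", e.Name(), volumes)
                // os.RemoveAll(volumes) would perform the actual deletion.
            }
        }
        return nil
    }

    func main() {
        _ = cleanupOrphans("/var/lib/kubelet/pods", map[string]bool{"fecfdbbc-226d-4daf-b162-43aeefc9d100": true})
    }
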
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.041392 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory\") pod \"fecfdbbc-226d-4daf-b162-43aeefc9d100\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") "
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.041506 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key\") pod \"fecfdbbc-226d-4daf-b162-43aeefc9d100\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") "
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.041613 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npbwz\" (UniqueName: \"kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz\") pod \"fecfdbbc-226d-4daf-b162-43aeefc9d100\" (UID: \"fecfdbbc-226d-4daf-b162-43aeefc9d100\") "
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.048901 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz" (OuterVolumeSpecName: "kube-api-access-npbwz") pod "fecfdbbc-226d-4daf-b162-43aeefc9d100" (UID: "fecfdbbc-226d-4daf-b162-43aeefc9d100"). InnerVolumeSpecName "kube-api-access-npbwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.072340 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fecfdbbc-226d-4daf-b162-43aeefc9d100" (UID: "fecfdbbc-226d-4daf-b162-43aeefc9d100"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.072901 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory" (OuterVolumeSpecName: "inventory") pod "fecfdbbc-226d-4daf-b162-43aeefc9d100" (UID: "fecfdbbc-226d-4daf-b162-43aeefc9d100"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.144145 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.144177 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fecfdbbc-226d-4daf-b162-43aeefc9d100-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.144187 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npbwz\" (UniqueName: \"kubernetes.io/projected/fecfdbbc-226d-4daf-b162-43aeefc9d100-kube-api-access-npbwz\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.435549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9" event={"ID":"fecfdbbc-226d-4daf-b162-43aeefc9d100","Type":"ContainerDied","Data":"6a9acec2102c07c1453f1a0cf7ab5f393375554c94ab730e7d4340137feb7fd0"}
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.435592 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9acec2102c07c1453f1a0cf7ab5f393375554c94ab730e7d4340137feb7fd0"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.435643 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.516079 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p94c6"]
Sep 30 17:34:47 crc kubenswrapper[4772]: E0930 17:34:47.516528 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecfdbbc-226d-4daf-b162-43aeefc9d100" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.516542 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecfdbbc-226d-4daf-b162-43aeefc9d100" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.516714 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecfdbbc-226d-4daf-b162-43aeefc9d100" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.517375 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.522610 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.522995 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.523637 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.523793 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.531598 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p94c6"]
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.652421 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.652473 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.652508 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wvh\" (UniqueName: \"kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.755038 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.755126 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.755166 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wvh\" (UniqueName: \"kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.759831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.760007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.790304 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wvh\" (UniqueName: \"kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh\") pod \"ssh-known-hosts-edpm-deployment-p94c6\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:47 crc kubenswrapper[4772]: I0930 17:34:47.840582 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6"
Sep 30 17:34:48 crc kubenswrapper[4772]: I0930 17:34:48.398962 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p94c6"]
Sep 30 17:34:48 crc kubenswrapper[4772]: W0930 17:34:48.406037 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf660345_e15a_46d9_b1f2_8cd460d61a9d.slice/crio-819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e WatchSource:0}: Error finding container 819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e: Status 404 returned error can't find the container with id 819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e
Sep 30 17:34:48 crc kubenswrapper[4772]: I0930 17:34:48.444904 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" event={"ID":"bf660345-e15a-46d9-b1f2-8cd460d61a9d","Type":"ContainerStarted","Data":"819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e"}
Sep 30 17:34:49 crc kubenswrapper[4772]: I0930 17:34:49.455791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" event={"ID":"bf660345-e15a-46d9-b1f2-8cd460d61a9d","Type":"ContainerStarted","Data":"70b341f0919502bad9dc4b69c8c26525acfc5146983f554488efdb7041b8a4ec"}
Sep 30 17:34:49 crc kubenswrapper[4772]: I0930 17:34:49.470588 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" podStartSLOduration=2.062406797 podStartE2EDuration="2.470570198s" podCreationTimestamp="2025-09-30 17:34:47 +0000 UTC" firstStartedPulling="2025-09-30 17:34:48.40842517 +0000 UTC m=+1989.315438001" lastFinishedPulling="2025-09-30 17:34:48.816588571 +0000 UTC m=+1989.723601402" observedRunningTime="2025-09-30 17:34:49.468176305 +0000 UTC m=+1990.375189156" watchObservedRunningTime="2025-09-30 17:34:49.470570198 +0000 UTC m=+1990.377583029"
Sep 30 17:34:56 crc kubenswrapper[4772]: I0930 17:34:56.524774 4772 generic.go:334] "Generic (PLEG): container finished" podID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" containerID="70b341f0919502bad9dc4b69c8c26525acfc5146983f554488efdb7041b8a4ec" exitCode=0
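
The W-level manager.go:1169 entry above is cAdvisor reacting to a cgroup watch event for a container it cannot yet (or can no longer) look up, hence "Status 404 returned error can't find the container"; the ContainerStarted event that follows supersedes it, so the warning is a benign race. A sketch of tolerating that race, with errNotFound and lookupContainer as hypothetical stand-ins:

    // Sketch of treating a 404 on a watch event as a benign, retryable race.
    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("status 404: container not found")

    func lookupContainer(id string) error { return errNotFound } // hypothetical stand-in

    func handleWatchEvent(containerID string) {
        if err := lookupContainer(containerID); errors.Is(err, errNotFound) {
            // Log and skip; the next lifecycle event carries fresher state.
            fmt.Printf("Failed to process watch event for %s: %v\n", containerID, err)
            return
        }
    }

    func main() {
        handleWatchEvent("819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e")
    }
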
podID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" containerID="70b341f0919502bad9dc4b69c8c26525acfc5146983f554488efdb7041b8a4ec" exitCode=0 Sep 30 17:34:56 crc kubenswrapper[4772]: I0930 17:34:56.524888 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" event={"ID":"bf660345-e15a-46d9-b1f2-8cd460d61a9d","Type":"ContainerDied","Data":"70b341f0919502bad9dc4b69c8c26525acfc5146983f554488efdb7041b8a4ec"} Sep 30 17:34:57 crc kubenswrapper[4772]: I0930 17:34:57.939717 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.056881 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87wvh\" (UniqueName: \"kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh\") pod \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.056994 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0\") pod \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.057230 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam\") pod \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\" (UID: \"bf660345-e15a-46d9-b1f2-8cd460d61a9d\") " Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.068086 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh" (OuterVolumeSpecName: "kube-api-access-87wvh") pod "bf660345-e15a-46d9-b1f2-8cd460d61a9d" (UID: "bf660345-e15a-46d9-b1f2-8cd460d61a9d"). InnerVolumeSpecName "kube-api-access-87wvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.085969 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bf660345-e15a-46d9-b1f2-8cd460d61a9d" (UID: "bf660345-e15a-46d9-b1f2-8cd460d61a9d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.091130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf660345-e15a-46d9-b1f2-8cd460d61a9d" (UID: "bf660345-e15a-46d9-b1f2-8cd460d61a9d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.159732 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87wvh\" (UniqueName: \"kubernetes.io/projected/bf660345-e15a-46d9-b1f2-8cd460d61a9d-kube-api-access-87wvh\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.159765 4772 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.159777 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf660345-e15a-46d9-b1f2-8cd460d61a9d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.549118 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" event={"ID":"bf660345-e15a-46d9-b1f2-8cd460d61a9d","Type":"ContainerDied","Data":"819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e"} Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.549666 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="819a64152f3cba1c5cc1ce45005a0feb57c86b15dc0c873337d820eece35648e" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.549267 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p94c6" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.635968 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4"] Sep 30 17:34:58 crc kubenswrapper[4772]: E0930 17:34:58.636534 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.636554 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.636769 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.637763 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.640410 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.640462 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.640863 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.643445 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.646789 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4"] Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.786677 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.786773 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.786909 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s5rq\" (UniqueName: \"kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.889159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.889243 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.889299 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s5rq\" (UniqueName: \"kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.895089 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.899894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.908622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s5rq\" (UniqueName: \"kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7vzz4\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:58 crc kubenswrapper[4772]: I0930 17:34:58.977257 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:34:59 crc kubenswrapper[4772]: I0930 17:34:59.545836 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4"] Sep 30 17:34:59 crc kubenswrapper[4772]: I0930 17:34:59.548345 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:34:59 crc kubenswrapper[4772]: I0930 17:34:59.560227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" event={"ID":"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0","Type":"ContainerStarted","Data":"5e61d23f894b97549d209e942272ddfbbc88df77219170f6977db2708488637a"} Sep 30 17:35:00 crc kubenswrapper[4772]: I0930 17:35:00.570823 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" event={"ID":"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0","Type":"ContainerStarted","Data":"5bbd02d06ae963aa1269f44bbbaf1b8a4046da66809139963d3afe2fa9fbbfbb"} Sep 30 17:35:00 crc kubenswrapper[4772]: I0930 17:35:00.595552 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" podStartSLOduration=2.095930905 podStartE2EDuration="2.595526287s" podCreationTimestamp="2025-09-30 17:34:58 +0000 UTC" firstStartedPulling="2025-09-30 17:34:59.548145317 +0000 UTC m=+2000.455158148" lastFinishedPulling="2025-09-30 17:35:00.047740709 +0000 UTC m=+2000.954753530" observedRunningTime="2025-09-30 17:35:00.592143698 +0000 UTC m=+2001.499156549" watchObservedRunningTime="2025-09-30 17:35:00.595526287 +0000 UTC m=+2001.502539118" Sep 30 17:35:08 crc kubenswrapper[4772]: I0930 17:35:08.646505 4772 generic.go:334] "Generic (PLEG): container finished" podID="bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" containerID="5bbd02d06ae963aa1269f44bbbaf1b8a4046da66809139963d3afe2fa9fbbfbb" exitCode=0 Sep 30 17:35:08 crc kubenswrapper[4772]: I0930 17:35:08.646621 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" event={"ID":"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0","Type":"ContainerDied","Data":"5bbd02d06ae963aa1269f44bbbaf1b8a4046da66809139963d3afe2fa9fbbfbb"} Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.100441 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.221782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key\") pod \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.221891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s5rq\" (UniqueName: \"kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq\") pod \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.222000 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory\") pod \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\" (UID: \"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0\") " Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.231498 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq" (OuterVolumeSpecName: "kube-api-access-6s5rq") pod "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" (UID: "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0"). InnerVolumeSpecName "kube-api-access-6s5rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.253399 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory" (OuterVolumeSpecName: "inventory") pod "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" (UID: "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.267803 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" (UID: "bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.325630 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s5rq\" (UniqueName: \"kubernetes.io/projected/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-kube-api-access-6s5rq\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.325668 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.325681 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.670908 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" event={"ID":"bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0","Type":"ContainerDied","Data":"5e61d23f894b97549d209e942272ddfbbc88df77219170f6977db2708488637a"} Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.670955 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e61d23f894b97549d209e942272ddfbbc88df77219170f6977db2708488637a" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.670953 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.735865 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k"] Sep 30 17:35:10 crc kubenswrapper[4772]: E0930 17:35:10.736370 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.736395 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.736598 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.737750 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.748437 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.748670 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.748789 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.749004 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.760003 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k"] Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.834763 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.834847 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48ct\" (UniqueName: \"kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.834882 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.936693 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48ct\" (UniqueName: \"kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.936760 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.936892 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: 
\"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.940529 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.940675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:10 crc kubenswrapper[4772]: I0930 17:35:10.953670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48ct\" (UniqueName: \"kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:11 crc kubenswrapper[4772]: I0930 17:35:11.078470 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:11 crc kubenswrapper[4772]: I0930 17:35:11.712006 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k"] Sep 30 17:35:11 crc kubenswrapper[4772]: W0930 17:35:11.714126 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae3e1d5_7f5f_468b_8b80_a8d0c012d017.slice/crio-ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344 WatchSource:0}: Error finding container ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344: Status 404 returned error can't find the container with id ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344 Sep 30 17:35:12 crc kubenswrapper[4772]: I0930 17:35:12.688849 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" event={"ID":"cae3e1d5-7f5f-468b-8b80-a8d0c012d017","Type":"ContainerStarted","Data":"aa122e777b22c73b768dd39c8ebe05a682a970f6136f37f70f3b94181a745922"} Sep 30 17:35:12 crc kubenswrapper[4772]: I0930 17:35:12.689437 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" event={"ID":"cae3e1d5-7f5f-468b-8b80-a8d0c012d017","Type":"ContainerStarted","Data":"ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344"} Sep 30 17:35:12 crc kubenswrapper[4772]: I0930 17:35:12.703106 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" podStartSLOduration=1.975225407 podStartE2EDuration="2.703092085s" podCreationTimestamp="2025-09-30 17:35:10 +0000 UTC" firstStartedPulling="2025-09-30 17:35:11.716527123 +0000 UTC m=+2012.623539954" lastFinishedPulling="2025-09-30 17:35:12.444393801 +0000 UTC m=+2013.351406632" observedRunningTime="2025-09-30 17:35:12.701612136 +0000 UTC m=+2013.608624967" 
watchObservedRunningTime="2025-09-30 17:35:12.703092085 +0000 UTC m=+2013.610104916" Sep 30 17:35:22 crc kubenswrapper[4772]: I0930 17:35:22.778281 4772 generic.go:334] "Generic (PLEG): container finished" podID="cae3e1d5-7f5f-468b-8b80-a8d0c012d017" containerID="aa122e777b22c73b768dd39c8ebe05a682a970f6136f37f70f3b94181a745922" exitCode=0 Sep 30 17:35:22 crc kubenswrapper[4772]: I0930 17:35:22.778382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" event={"ID":"cae3e1d5-7f5f-468b-8b80-a8d0c012d017","Type":"ContainerDied","Data":"aa122e777b22c73b768dd39c8ebe05a682a970f6136f37f70f3b94181a745922"} Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.160471 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.199233 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory\") pod \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.199484 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48ct\" (UniqueName: \"kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct\") pod \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.199585 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key\") pod \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\" (UID: \"cae3e1d5-7f5f-468b-8b80-a8d0c012d017\") " Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.206123 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct" (OuterVolumeSpecName: "kube-api-access-m48ct") pod "cae3e1d5-7f5f-468b-8b80-a8d0c012d017" (UID: "cae3e1d5-7f5f-468b-8b80-a8d0c012d017"). InnerVolumeSpecName "kube-api-access-m48ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.226026 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cae3e1d5-7f5f-468b-8b80-a8d0c012d017" (UID: "cae3e1d5-7f5f-468b-8b80-a8d0c012d017"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.232257 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory" (OuterVolumeSpecName: "inventory") pod "cae3e1d5-7f5f-468b-8b80-a8d0c012d017" (UID: "cae3e1d5-7f5f-468b-8b80-a8d0c012d017"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.301753 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48ct\" (UniqueName: \"kubernetes.io/projected/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-kube-api-access-m48ct\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.301789 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.301800 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cae3e1d5-7f5f-468b-8b80-a8d0c012d017-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.800448 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" event={"ID":"cae3e1d5-7f5f-468b-8b80-a8d0c012d017","Type":"ContainerDied","Data":"ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344"} Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.801002 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb2419f49788a5e223784af3c9d7dc24af3315d67e0c490aeb1dbda1fabf344" Sep 30 17:35:24 crc kubenswrapper[4772]: I0930 17:35:24.800516 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k" Sep 30 17:35:27 crc kubenswrapper[4772]: I0930 17:35:27.040903 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwmjj"] Sep 30 17:35:27 crc kubenswrapper[4772]: I0930 17:35:27.050164 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwmjj"] Sep 30 17:35:27 crc kubenswrapper[4772]: I0930 17:35:27.909518 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4e6bec-dedc-4022-a6e2-d615ec89a5db" path="/var/lib/kubelet/pods/4d4e6bec-dedc-4022-a6e2-d615ec89a5db/volumes" Sep 30 17:35:43 crc kubenswrapper[4772]: I0930 17:35:43.760689 4772 scope.go:117] "RemoveContainer" containerID="a4dc299d7e32e1abf9062711f58d5b154593c40e3440f02cc0dacc20b5b7f014" Sep 30 17:35:43 crc kubenswrapper[4772]: I0930 17:35:43.810800 4772 scope.go:117] "RemoveContainer" containerID="130bb11040581823309c0a46d0fca21013d2a85f029bb5f10c13a83101632746" Sep 30 17:35:43 crc kubenswrapper[4772]: I0930 17:35:43.862438 4772 scope.go:117] "RemoveContainer" containerID="6c5c0f407021dbbf16dbddb948491bd9cc0c44ef6d19cee05096b9e686ad98c6" Sep 30 17:36:08 crc kubenswrapper[4772]: I0930 17:36:08.655548 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:36:08 crc kubenswrapper[4772]: I0930 17:36:08.656091 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:36:38 crc kubenswrapper[4772]: I0930 17:36:38.655589 4772 
patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:36:38 crc kubenswrapper[4772]: I0930 17:36:38.656223 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.450609 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:45 crc kubenswrapper[4772]: E0930 17:36:45.451710 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae3e1d5-7f5f-468b-8b80-a8d0c012d017" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.451727 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae3e1d5-7f5f-468b-8b80-a8d0c012d017" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.451954 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae3e1d5-7f5f-468b-8b80-a8d0c012d017" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.453691 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.487290 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.523429 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.523542 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s55hx\" (UniqueName: \"kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.523598 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.625602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 
crc kubenswrapper[4772]: I0930 17:36:45.625728 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s55hx\" (UniqueName: \"kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.625818 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.626455 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.626538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.674021 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s55hx\" (UniqueName: \"kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx\") pod \"community-operators-gx22r\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:45 crc kubenswrapper[4772]: I0930 17:36:45.778665 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:46 crc kubenswrapper[4772]: I0930 17:36:46.290936 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:46 crc kubenswrapper[4772]: I0930 17:36:46.557463 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerStarted","Data":"73a82c160fbbc169c0da8b93884db4eeb3e22d334830b16db9b9ab1e4e565494"} Sep 30 17:36:47 crc kubenswrapper[4772]: I0930 17:36:47.566237 4772 generic.go:334] "Generic (PLEG): container finished" podID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerID="59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7" exitCode=0 Sep 30 17:36:47 crc kubenswrapper[4772]: I0930 17:36:47.566276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerDied","Data":"59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7"} Sep 30 17:36:49 crc kubenswrapper[4772]: I0930 17:36:49.586349 4772 generic.go:334] "Generic (PLEG): container finished" podID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerID="089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2" exitCode=0 Sep 30 17:36:49 crc kubenswrapper[4772]: I0930 17:36:49.586417 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerDied","Data":"089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2"} Sep 30 17:36:50 crc kubenswrapper[4772]: I0930 17:36:50.598917 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerStarted","Data":"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c"} Sep 30 17:36:50 crc kubenswrapper[4772]: I0930 17:36:50.620307 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gx22r" podStartSLOduration=2.834229267 podStartE2EDuration="5.620286774s" podCreationTimestamp="2025-09-30 17:36:45 +0000 UTC" firstStartedPulling="2025-09-30 17:36:47.568593192 +0000 UTC m=+2108.475606023" lastFinishedPulling="2025-09-30 17:36:50.354650709 +0000 UTC m=+2111.261663530" observedRunningTime="2025-09-30 17:36:50.615879766 +0000 UTC m=+2111.522892607" watchObservedRunningTime="2025-09-30 17:36:50.620286774 +0000 UTC m=+2111.527299615" Sep 30 17:36:55 crc kubenswrapper[4772]: I0930 17:36:55.779669 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:55 crc kubenswrapper[4772]: I0930 17:36:55.780604 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:55 crc kubenswrapper[4772]: I0930 17:36:55.837515 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:56 crc kubenswrapper[4772]: I0930 17:36:56.695583 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:56 crc kubenswrapper[4772]: I0930 17:36:56.746504 4772 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:58 crc kubenswrapper[4772]: I0930 17:36:58.665887 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gx22r" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="registry-server" containerID="cri-o://219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c" gracePeriod=2 Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.138671 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.203080 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content\") pod \"65ee69d2-5a95-4286-8773-334ef2f2d95a\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.203619 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s55hx\" (UniqueName: \"kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx\") pod \"65ee69d2-5a95-4286-8773-334ef2f2d95a\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.203682 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities\") pod \"65ee69d2-5a95-4286-8773-334ef2f2d95a\" (UID: \"65ee69d2-5a95-4286-8773-334ef2f2d95a\") " Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.204623 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities" (OuterVolumeSpecName: "utilities") pod "65ee69d2-5a95-4286-8773-334ef2f2d95a" (UID: "65ee69d2-5a95-4286-8773-334ef2f2d95a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.211526 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx" (OuterVolumeSpecName: "kube-api-access-s55hx") pod "65ee69d2-5a95-4286-8773-334ef2f2d95a" (UID: "65ee69d2-5a95-4286-8773-334ef2f2d95a"). InnerVolumeSpecName "kube-api-access-s55hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.254590 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65ee69d2-5a95-4286-8773-334ef2f2d95a" (UID: "65ee69d2-5a95-4286-8773-334ef2f2d95a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.306397 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s55hx\" (UniqueName: \"kubernetes.io/projected/65ee69d2-5a95-4286-8773-334ef2f2d95a-kube-api-access-s55hx\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.306452 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.306467 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ee69d2-5a95-4286-8773-334ef2f2d95a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.683119 4772 generic.go:334] "Generic (PLEG): container finished" podID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerID="219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c" exitCode=0 Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.683180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerDied","Data":"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c"} Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.683215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx22r" event={"ID":"65ee69d2-5a95-4286-8773-334ef2f2d95a","Type":"ContainerDied","Data":"73a82c160fbbc169c0da8b93884db4eeb3e22d334830b16db9b9ab1e4e565494"} Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.683236 4772 scope.go:117] "RemoveContainer" containerID="219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.683180 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gx22r" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.712254 4772 scope.go:117] "RemoveContainer" containerID="089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.719385 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.731281 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gx22r"] Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.738654 4772 scope.go:117] "RemoveContainer" containerID="59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.778774 4772 scope.go:117] "RemoveContainer" containerID="219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c" Sep 30 17:36:59 crc kubenswrapper[4772]: E0930 17:36:59.779359 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c\": container with ID starting with 219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c not found: ID does not exist" containerID="219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.779402 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c"} err="failed to get container status \"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c\": rpc error: code = NotFound desc = could not find container \"219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c\": container with ID starting with 219b787968c1b1a300f3b9739cb486e1762f381ddbbf134c501a55ad95d95c3c not found: ID does not exist" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.779428 4772 scope.go:117] "RemoveContainer" containerID="089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2" Sep 30 17:36:59 crc kubenswrapper[4772]: E0930 17:36:59.780120 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2\": container with ID starting with 089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2 not found: ID does not exist" containerID="089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.780154 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2"} err="failed to get container status \"089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2\": rpc error: code = NotFound desc = could not find container \"089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2\": container with ID starting with 089f98ce753ecebb5bde702d0b4fa175911e343b8cc84cdb946e7c491001a4e2 not found: ID does not exist" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.780175 4772 scope.go:117] "RemoveContainer" containerID="59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7" Sep 30 17:36:59 crc kubenswrapper[4772]: E0930 17:36:59.780532 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7\": container with ID starting with 59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7 not found: ID does not exist" containerID="59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.780603 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7"} err="failed to get container status \"59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7\": rpc error: code = NotFound desc = could not find container \"59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7\": container with ID starting with 59135331a1ae781c15bbabd89478aea47a44da08aa7bcad9090ec81a16933dc7 not found: ID does not exist" Sep 30 17:36:59 crc kubenswrapper[4772]: I0930 17:36:59.911324 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" path="/var/lib/kubelet/pods/65ee69d2-5a95-4286-8773-334ef2f2d95a/volumes" Sep 30 17:37:08 crc kubenswrapper[4772]: I0930 17:37:08.655395 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:37:08 crc kubenswrapper[4772]: I0930 17:37:08.656229 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:37:08 crc kubenswrapper[4772]: I0930 17:37:08.656287 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:37:08 crc kubenswrapper[4772]: I0930 17:37:08.657204 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:37:08 crc kubenswrapper[4772]: I0930 17:37:08.657269 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76" gracePeriod=600 Sep 30 17:37:09 crc kubenswrapper[4772]: I0930 17:37:09.780105 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76" exitCode=0 Sep 30 17:37:09 crc kubenswrapper[4772]: I0930 17:37:09.780190 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76"} Sep 30 17:37:09 crc kubenswrapper[4772]: I0930 17:37:09.781087 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f"} Sep 30 17:37:09 crc kubenswrapper[4772]: I0930 17:37:09.781153 4772 scope.go:117] "RemoveContainer" containerID="c3fce071cd26cc5695a4b61b75ef7003b97d094f1f7e57a2fca51ae131cdddef" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.263400 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:14 crc kubenswrapper[4772]: E0930 17:37:14.265292 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="extract-content" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.265321 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="extract-content" Sep 30 17:37:14 crc kubenswrapper[4772]: E0930 17:37:14.265339 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="extract-utilities" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.265348 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="extract-utilities" Sep 30 17:37:14 crc kubenswrapper[4772]: E0930 17:37:14.265371 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="registry-server" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.265379 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="registry-server" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.265653 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ee69d2-5a95-4286-8773-334ef2f2d95a" containerName="registry-server" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.267646 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.278935 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.342745 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.342831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-679vf\" (UniqueName: \"kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.342945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.443707 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.443829 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.443897 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-679vf\" (UniqueName: \"kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.444462 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.444763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.471306 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-679vf\" (UniqueName: \"kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf\") pod \"redhat-marketplace-l2wxl\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:14 crc kubenswrapper[4772]: I0930 17:37:14.600998 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:15 crc kubenswrapper[4772]: I0930 17:37:15.118243 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:15 crc kubenswrapper[4772]: I0930 17:37:15.849126 4772 generic.go:334] "Generic (PLEG): container finished" podID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerID="5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90" exitCode=0 Sep 30 17:37:15 crc kubenswrapper[4772]: I0930 17:37:15.849211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerDied","Data":"5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90"} Sep 30 17:37:15 crc kubenswrapper[4772]: I0930 17:37:15.849564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerStarted","Data":"c59bf2451e818336ab09b9dd99099b566d63fb22e4c006012ae4b1d04aa907da"} Sep 30 17:37:17 crc kubenswrapper[4772]: I0930 17:37:17.875334 4772 generic.go:334] "Generic (PLEG): container finished" podID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerID="d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec" exitCode=0 Sep 30 17:37:17 crc kubenswrapper[4772]: I0930 17:37:17.875603 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerDied","Data":"d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec"} Sep 30 17:37:18 crc kubenswrapper[4772]: I0930 17:37:18.887046 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerStarted","Data":"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34"} Sep 30 17:37:18 crc kubenswrapper[4772]: I0930 17:37:18.915204 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2wxl" podStartSLOduration=2.515641932 podStartE2EDuration="4.915178032s" podCreationTimestamp="2025-09-30 17:37:14 +0000 UTC" firstStartedPulling="2025-09-30 17:37:15.851102797 +0000 UTC m=+2136.758115628" lastFinishedPulling="2025-09-30 17:37:18.250638897 +0000 UTC m=+2139.157651728" observedRunningTime="2025-09-30 17:37:18.907472645 +0000 UTC m=+2139.814485496" watchObservedRunningTime="2025-09-30 17:37:18.915178032 +0000 UTC m=+2139.822190863" Sep 30 17:37:24 crc kubenswrapper[4772]: I0930 17:37:24.601858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:24 crc kubenswrapper[4772]: I0930 17:37:24.603376 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:24 crc kubenswrapper[4772]: I0930 17:37:24.645912 4772 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:24 crc kubenswrapper[4772]: I0930 17:37:24.991933 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:25 crc kubenswrapper[4772]: I0930 17:37:25.045566 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:26 crc kubenswrapper[4772]: I0930 17:37:26.956763 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2wxl" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="registry-server" containerID="cri-o://6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34" gracePeriod=2 Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.467884 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.603647 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities\") pod \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.603758 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content\") pod \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.604102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-679vf\" (UniqueName: \"kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf\") pod \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\" (UID: \"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d\") " Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.608884 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities" (OuterVolumeSpecName: "utilities") pod "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" (UID: "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.610701 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf" (OuterVolumeSpecName: "kube-api-access-679vf") pod "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" (UID: "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d"). InnerVolumeSpecName "kube-api-access-679vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.620575 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" (UID: "ba8ee6cc-4e8c-48be-afbc-ab648bcb786d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.707417 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-679vf\" (UniqueName: \"kubernetes.io/projected/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-kube-api-access-679vf\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.707887 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.708108 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.984458 4772 generic.go:334] "Generic (PLEG): container finished" podID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerID="6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34" exitCode=0 Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.984512 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerDied","Data":"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34"} Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.984543 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2wxl" event={"ID":"ba8ee6cc-4e8c-48be-afbc-ab648bcb786d","Type":"ContainerDied","Data":"c59bf2451e818336ab09b9dd99099b566d63fb22e4c006012ae4b1d04aa907da"} Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.984572 4772 scope.go:117] "RemoveContainer" containerID="6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34" Sep 30 17:37:27 crc kubenswrapper[4772]: I0930 17:37:27.984960 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2wxl" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.015561 4772 scope.go:117] "RemoveContainer" containerID="d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.016571 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.024861 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2wxl"] Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.036119 4772 scope.go:117] "RemoveContainer" containerID="5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.081226 4772 scope.go:117] "RemoveContainer" containerID="6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34" Sep 30 17:37:28 crc kubenswrapper[4772]: E0930 17:37:28.081568 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34\": container with ID starting with 6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34 not found: ID does not exist" containerID="6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.081603 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34"} err="failed to get container status \"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34\": rpc error: code = NotFound desc = could not find container \"6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34\": container with ID starting with 6f5222696649489410c2ae02817c0ec06c97fe68669756b5b6d3f9043ee99d34 not found: ID does not exist" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.081629 4772 scope.go:117] "RemoveContainer" containerID="d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec" Sep 30 17:37:28 crc kubenswrapper[4772]: E0930 17:37:28.081863 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec\": container with ID starting with d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec not found: ID does not exist" containerID="d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.081890 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec"} err="failed to get container status \"d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec\": rpc error: code = NotFound desc = could not find container \"d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec\": container with ID starting with d537da1f73c478bb65c23ad71568fbe417b00c8fddbed8c9c5d02fbb4ca80aec not found: ID does not exist" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.081907 4772 scope.go:117] "RemoveContainer" containerID="5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90" Sep 30 17:37:28 crc kubenswrapper[4772]: E0930 17:37:28.082290 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90\": container with ID starting with 5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90 not found: ID does not exist" containerID="5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90" Sep 30 17:37:28 crc kubenswrapper[4772]: I0930 17:37:28.082313 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90"} err="failed to get container status \"5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90\": rpc error: code = NotFound desc = could not find container \"5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90\": container with ID starting with 5aec5cc3ee6b0e578995651fae99d2886183905ece7a8e0f447ea9d2322e8e90 not found: ID does not exist" Sep 30 17:37:29 crc kubenswrapper[4772]: I0930 17:37:29.909908 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" path="/var/lib/kubelet/pods/ba8ee6cc-4e8c-48be-afbc-ab648bcb786d/volumes" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.745088 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:37 crc kubenswrapper[4772]: E0930 17:38:37.746550 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="extract-utilities" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.746567 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="extract-utilities" Sep 30 17:38:37 crc kubenswrapper[4772]: E0930 17:38:37.746588 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="registry-server" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.746594 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="registry-server" Sep 30 17:38:37 crc kubenswrapper[4772]: E0930 17:38:37.746635 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="extract-content" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.746642 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="extract-content" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.746880 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8ee6cc-4e8c-48be-afbc-ab648bcb786d" containerName="registry-server" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.748837 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.753177 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.887470 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.887625 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.887651 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9spfw\" (UniqueName: \"kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.989732 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.989829 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9spfw\" (UniqueName: \"kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.989933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.990603 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:37 crc kubenswrapper[4772]: I0930 17:38:37.991554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:38 crc kubenswrapper[4772]: I0930 17:38:38.016148 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9spfw\" (UniqueName: \"kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw\") pod \"certified-operators-7q8jm\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:38 crc kubenswrapper[4772]: I0930 17:38:38.079536 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:38 crc kubenswrapper[4772]: I0930 17:38:38.624813 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:39 crc kubenswrapper[4772]: I0930 17:38:39.650491 4772 generic.go:334] "Generic (PLEG): container finished" podID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerID="8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3" exitCode=0 Sep 30 17:38:39 crc kubenswrapper[4772]: I0930 17:38:39.650590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerDied","Data":"8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3"} Sep 30 17:38:39 crc kubenswrapper[4772]: I0930 17:38:39.650723 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerStarted","Data":"09a8b6b025ee26056fedb488bc210b131c9ec069d7698de869aecb8477af7954"} Sep 30 17:38:41 crc kubenswrapper[4772]: I0930 17:38:41.671988 4772 generic.go:334] "Generic (PLEG): container finished" podID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerID="76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e" exitCode=0 Sep 30 17:38:41 crc kubenswrapper[4772]: I0930 17:38:41.672043 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerDied","Data":"76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e"} Sep 30 17:38:44 crc kubenswrapper[4772]: I0930 17:38:44.701374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerStarted","Data":"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c"} Sep 30 17:38:44 crc kubenswrapper[4772]: I0930 17:38:44.726423 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7q8jm" podStartSLOduration=3.84018238 podStartE2EDuration="7.726393945s" podCreationTimestamp="2025-09-30 17:38:37 +0000 UTC" firstStartedPulling="2025-09-30 17:38:39.663463333 +0000 UTC m=+2220.570476164" lastFinishedPulling="2025-09-30 17:38:43.549674898 +0000 UTC m=+2224.456687729" observedRunningTime="2025-09-30 17:38:44.722867411 +0000 UTC m=+2225.629880282" watchObservedRunningTime="2025-09-30 17:38:44.726393945 +0000 UTC m=+2225.633406786" Sep 30 17:38:48 crc kubenswrapper[4772]: I0930 17:38:48.080176 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:48 crc kubenswrapper[4772]: I0930 17:38:48.080772 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:48 crc kubenswrapper[4772]: I0930 17:38:48.129111 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:48 crc kubenswrapper[4772]: I0930 17:38:48.777361 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:48 crc kubenswrapper[4772]: I0930 17:38:48.832578 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:50 crc kubenswrapper[4772]: I0930 17:38:50.750804 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7q8jm" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="registry-server" containerID="cri-o://272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c" gracePeriod=2 Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.188331 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.262744 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities\") pod \"628a7ea9-b0da-4def-8397-bc6177b4e87e\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.263273 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9spfw\" (UniqueName: \"kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw\") pod \"628a7ea9-b0da-4def-8397-bc6177b4e87e\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.264286 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities" (OuterVolumeSpecName: "utilities") pod "628a7ea9-b0da-4def-8397-bc6177b4e87e" (UID: "628a7ea9-b0da-4def-8397-bc6177b4e87e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.265870 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content\") pod \"628a7ea9-b0da-4def-8397-bc6177b4e87e\" (UID: \"628a7ea9-b0da-4def-8397-bc6177b4e87e\") " Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.266787 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.273117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw" (OuterVolumeSpecName: "kube-api-access-9spfw") pod "628a7ea9-b0da-4def-8397-bc6177b4e87e" (UID: "628a7ea9-b0da-4def-8397-bc6177b4e87e"). InnerVolumeSpecName "kube-api-access-9spfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.316863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "628a7ea9-b0da-4def-8397-bc6177b4e87e" (UID: "628a7ea9-b0da-4def-8397-bc6177b4e87e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.369017 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9spfw\" (UniqueName: \"kubernetes.io/projected/628a7ea9-b0da-4def-8397-bc6177b4e87e-kube-api-access-9spfw\") on node \"crc\" DevicePath \"\"" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.369086 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628a7ea9-b0da-4def-8397-bc6177b4e87e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.764369 4772 generic.go:334] "Generic (PLEG): container finished" podID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerID="272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c" exitCode=0 Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.764426 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerDied","Data":"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c"} Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.764445 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q8jm" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.764471 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q8jm" event={"ID":"628a7ea9-b0da-4def-8397-bc6177b4e87e","Type":"ContainerDied","Data":"09a8b6b025ee26056fedb488bc210b131c9ec069d7698de869aecb8477af7954"} Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.764496 4772 scope.go:117] "RemoveContainer" containerID="272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.791344 4772 scope.go:117] "RemoveContainer" containerID="76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.809544 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.818975 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7q8jm"] Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.831990 4772 scope.go:117] "RemoveContainer" containerID="8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3" Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.863739 4772 scope.go:117] "RemoveContainer" containerID="272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c" Sep 30 17:38:51 crc kubenswrapper[4772]: E0930 17:38:51.864373 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c\": container with ID starting with 
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.864431 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c"} err="failed to get container status \"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c\": rpc error: code = NotFound desc = could not find container \"272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c\": container with ID starting with 272b944615a356846915dd6491d8a9996c8cb2af5f8aff49aca358543f76080c not found: ID does not exist"
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.864466 4772 scope.go:117] "RemoveContainer" containerID="76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e"
Sep 30 17:38:51 crc kubenswrapper[4772]: E0930 17:38:51.864879 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e\": container with ID starting with 76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e not found: ID does not exist" containerID="76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e"
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.864943 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e"} err="failed to get container status \"76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e\": rpc error: code = NotFound desc = could not find container \"76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e\": container with ID starting with 76b671d858acceac0a6d7295bff369514d203ccf0af1793e61961bd2a9ee550e not found: ID does not exist"
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.864968 4772 scope.go:117] "RemoveContainer" containerID="8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3"
Sep 30 17:38:51 crc kubenswrapper[4772]: E0930 17:38:51.865287 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3\": container with ID starting with 8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3 not found: ID does not exist" containerID="8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3"
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.865853 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3"} err="failed to get container status \"8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3\": rpc error: code = NotFound desc = could not find container \"8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3\": container with ID starting with 8cac8e009bca4405c09b05d441f0897547010594d15f627792cefa1cf3e9b5a3 not found: ID does not exist"
Sep 30 17:38:51 crc kubenswrapper[4772]: I0930 17:38:51.920572 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" path="/var/lib/kubelet/pods/628a7ea9-b0da-4def-8397-bc6177b4e87e/volumes"
Sep 30 17:39:08 crc kubenswrapper[4772]: I0930 17:39:08.655536 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
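The failing probe above is a plain HTTP GET against http://127.0.0.1:8798/health; while nothing is listening on the port, the dialer returns exactly the "connect: connection refused" seen in the output field. A minimal reproduction, assuming Kubernetes' documented rule that any status code from 200 up to but not including 400 counts as probe success:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP liveness check and reports it the way
// prober.go summarizes results.
func probe(url string) string {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// With no listener this is: Get "...": dial tcp ...: connect: connection refused
		return fmt.Sprintf("failure: %v", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success"
	}
	// Matches the neutron-api failure below: HTTP probe failed with statuscode: 502
	return fmt.Sprintf("failure: HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	fmt.Println(probe("http://127.0.0.1:8798/health"))
}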
Sep 30 17:39:08 crc kubenswrapper[4772]: I0930 17:39:08.655997 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:39:27 crc kubenswrapper[4772]: I0930 17:39:27.109784 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-684cbd44c-xstzf" podUID="d878293c-0383-4575-95cb-1062bcb4634e" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Sep 30 17:39:38 crc kubenswrapper[4772]: I0930 17:39:38.654957 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:39:38 crc kubenswrapper[4772]: I0930 17:39:38.655575 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.323113 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"]
Sep 30 17:40:04 crc kubenswrapper[4772]: E0930 17:40:04.324073 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="extract-content"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.324087 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="extract-content"
Sep 30 17:40:04 crc kubenswrapper[4772]: E0930 17:40:04.324105 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="extract-utilities"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.324111 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="extract-utilities"
Sep 30 17:40:04 crc kubenswrapper[4772]: E0930 17:40:04.324129 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="registry-server"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.324139 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="registry-server"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.324332 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="628a7ea9-b0da-4def-8397-bc6177b4e87e" containerName="registry-server"
Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.325799 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj79"
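The cpu_manager/state_mem pairs above run as the new catalog pod is admitted: resource assignments left behind by the deleted pod 628a7ea9-b0da-4def-8397-bc6177b4e87e are swept before the new pod is sized. A toy model of that sweep, keyed the way the messages suggest (podUID plus containerName); the types and values are illustrative, not the kubelet's:

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod is no longer active,
// mirroring the "RemoveStaleState: removing container" /
// "Deleted CPUSet assignment" pairs in the log.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments { // deleting during range is safe in Go
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key]string{
		{podUID: "628a7ea9", container: "extract-content"}: "cpus 0-1", // hypothetical values
		{podUID: "628a7ea9", container: "registry-server"}: "cpu 2",
		{podUID: "b5c0967e", container: "registry-server"}: "cpu 3",
	}
	removeStaleState(assignments, map[string]bool{"b5c0967e": true}) // 628a7ea9 was deleted
	fmt.Println("assignments left:", len(assignments))
}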
Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.341766 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"] Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.520805 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.520858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg9w\" (UniqueName: \"kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.520877 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.623864 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.623921 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9w\" (UniqueName: \"kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.623942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.624522 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.624594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.643001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jkg9w\" (UniqueName: \"kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w\") pod \"redhat-operators-7mj79\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:04 crc kubenswrapper[4772]: I0930 17:40:04.653113 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:05 crc kubenswrapper[4772]: I0930 17:40:05.107311 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"] Sep 30 17:40:05 crc kubenswrapper[4772]: I0930 17:40:05.433169 4772 generic.go:334] "Generic (PLEG): container finished" podID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerID="0f8528d87d2a95adf03b172d11eb33df65fb0fdceb4192ec6b0519d30992cc56" exitCode=0 Sep 30 17:40:05 crc kubenswrapper[4772]: I0930 17:40:05.433224 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerDied","Data":"0f8528d87d2a95adf03b172d11eb33df65fb0fdceb4192ec6b0519d30992cc56"} Sep 30 17:40:05 crc kubenswrapper[4772]: I0930 17:40:05.433260 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerStarted","Data":"95efdb7387259880f83aa606697003ab7f5e38bd1fa6ef6b7f798775fd1e696f"} Sep 30 17:40:05 crc kubenswrapper[4772]: I0930 17:40:05.439003 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:40:07 crc kubenswrapper[4772]: I0930 17:40:07.453945 4772 generic.go:334] "Generic (PLEG): container finished" podID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerID="c7d15c9fa1bacc2b355c9c1ef5c9532e25b4393d111121b4081ebd3180fa331b" exitCode=0 Sep 30 17:40:07 crc kubenswrapper[4772]: I0930 17:40:07.454035 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerDied","Data":"c7d15c9fa1bacc2b355c9c1ef5c9532e25b4393d111121b4081ebd3180fa331b"} Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.468473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerStarted","Data":"d158fd3ea80c39737f836daf9a57d5be235d3d06b30158d4b2393142131552f3"} Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.487902 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7mj79" podStartSLOduration=2.016781324 podStartE2EDuration="4.487887523s" podCreationTimestamp="2025-09-30 17:40:04 +0000 UTC" firstStartedPulling="2025-09-30 17:40:05.437024649 +0000 UTC m=+2306.344037480" lastFinishedPulling="2025-09-30 17:40:07.908130848 +0000 UTC m=+2308.815143679" observedRunningTime="2025-09-30 17:40:08.485824568 +0000 UTC m=+2309.392837409" watchObservedRunningTime="2025-09-30 17:40:08.487887523 +0000 UTC m=+2309.394900354" Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.655629 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.655699 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.655745 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.656544 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:40:08 crc kubenswrapper[4772]: I0930 17:40:08.656610 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" gracePeriod=600 Sep 30 17:40:08 crc kubenswrapper[4772]: E0930 17:40:08.783726 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:40:09 crc kubenswrapper[4772]: I0930 17:40:09.478664 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" exitCode=0 Sep 30 17:40:09 crc kubenswrapper[4772]: I0930 17:40:09.479077 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f"} Sep 30 17:40:09 crc kubenswrapper[4772]: I0930 17:40:09.479133 4772 scope.go:117] "RemoveContainer" containerID="aa262c38b488c0fff98a713c814bcc2a49aaeb671dd1e2237d6106aaff892d76" Sep 30 17:40:09 crc kubenswrapper[4772]: I0930 17:40:09.479950 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:40:09 crc kubenswrapper[4772]: E0930 17:40:09.480568 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:40:14 crc kubenswrapper[4772]: I0930 17:40:14.653286 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
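The "back-off 5m0s" above is CrashLoopBackOff at its ceiling: Kubernetes documents the restart delay as starting at 10s and doubling per failed restart, capped at five minutes (the repeats at 17:40:21, 17:40:35, 17:40:48 and beyond are the sync loop re-evaluating the pod while the back-off timer is still running). A sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

func main() {
	const cap = 5 * time.Minute // the "5m0s" seen in the log
	delay := 10 * time.Second   // documented initial back-off
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > cap {
			delay = cap
		}
	}
	// From the 6th failed restart onward this prints "back-off 5m0s",
	// matching the repeated pod_workers errors in this log.
}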
pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:14 crc kubenswrapper[4772]: I0930 17:40:14.655344 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:14 crc kubenswrapper[4772]: I0930 17:40:14.705171 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:15 crc kubenswrapper[4772]: I0930 17:40:15.579935 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:16 crc kubenswrapper[4772]: I0930 17:40:16.080960 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"] Sep 30 17:40:17 crc kubenswrapper[4772]: I0930 17:40:17.550556 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7mj79" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="registry-server" containerID="cri-o://d158fd3ea80c39737f836daf9a57d5be235d3d06b30158d4b2393142131552f3" gracePeriod=2 Sep 30 17:40:19 crc kubenswrapper[4772]: I0930 17:40:19.569511 4772 generic.go:334] "Generic (PLEG): container finished" podID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerID="d158fd3ea80c39737f836daf9a57d5be235d3d06b30158d4b2393142131552f3" exitCode=0 Sep 30 17:40:19 crc kubenswrapper[4772]: I0930 17:40:19.569571 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerDied","Data":"d158fd3ea80c39737f836daf9a57d5be235d3d06b30158d4b2393142131552f3"} Sep 30 17:40:19 crc kubenswrapper[4772]: I0930 17:40:19.931818 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.037739 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content\") pod \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.037952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities\") pod \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.038000 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkg9w\" (UniqueName: \"kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w\") pod \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\" (UID: \"b5c0967e-497d-4496-bbd5-32bd6045b6ae\") " Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.039212 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities" (OuterVolumeSpecName: "utilities") pod "b5c0967e-497d-4496-bbd5-32bd6045b6ae" (UID: "b5c0967e-497d-4496-bbd5-32bd6045b6ae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.047457 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w" (OuterVolumeSpecName: "kube-api-access-jkg9w") pod "b5c0967e-497d-4496-bbd5-32bd6045b6ae" (UID: "b5c0967e-497d-4496-bbd5-32bd6045b6ae"). InnerVolumeSpecName "kube-api-access-jkg9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.125485 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5c0967e-497d-4496-bbd5-32bd6045b6ae" (UID: "b5c0967e-497d-4496-bbd5-32bd6045b6ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.140044 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.140100 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkg9w\" (UniqueName: \"kubernetes.io/projected/b5c0967e-497d-4496-bbd5-32bd6045b6ae-kube-api-access-jkg9w\") on node \"crc\" DevicePath \"\"" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.140122 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c0967e-497d-4496-bbd5-32bd6045b6ae-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.581789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj79" event={"ID":"b5c0967e-497d-4496-bbd5-32bd6045b6ae","Type":"ContainerDied","Data":"95efdb7387259880f83aa606697003ab7f5e38bd1fa6ef6b7f798775fd1e696f"} Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.581846 4772 scope.go:117] "RemoveContainer" containerID="d158fd3ea80c39737f836daf9a57d5be235d3d06b30158d4b2393142131552f3" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.581910 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj79" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.613174 4772 scope.go:117] "RemoveContainer" containerID="c7d15c9fa1bacc2b355c9c1ef5c9532e25b4393d111121b4081ebd3180fa331b" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.641554 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"] Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.647963 4772 scope.go:117] "RemoveContainer" containerID="0f8528d87d2a95adf03b172d11eb33df65fb0fdceb4192ec6b0519d30992cc56" Sep 30 17:40:20 crc kubenswrapper[4772]: I0930 17:40:20.653125 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7mj79"] Sep 30 17:40:21 crc kubenswrapper[4772]: I0930 17:40:21.898526 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:40:21 crc kubenswrapper[4772]: E0930 17:40:21.899145 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:40:21 crc kubenswrapper[4772]: I0930 17:40:21.912197 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" path="/var/lib/kubelet/pods/b5c0967e-497d-4496-bbd5-32bd6045b6ae/volumes" Sep 30 17:40:35 crc kubenswrapper[4772]: I0930 17:40:35.898498 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:40:35 crc kubenswrapper[4772]: E0930 17:40:35.900335 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:40:48 crc kubenswrapper[4772]: I0930 17:40:48.901115 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:40:48 crc kubenswrapper[4772]: E0930 17:40:48.901860 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.585761 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.594001 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-9c4wl"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.601659 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.609505 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.618686 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p94c6"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.624913 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.631919 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.638252 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.644236 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bplkf"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.650883 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7vzz4"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.657699 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.664748 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rw9px"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.672376 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh8pc"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.679275 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.705167 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.712834 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p94c6"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.722215 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.734357 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-89s5l"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.753880 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x64zl"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.761658 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2qs7k"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.775020 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t58j5"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.784740 4772 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-56tf9"] Sep 30 17:41:00 crc kubenswrapper[4772]: I0930 17:41:00.898690 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:41:00 crc kubenswrapper[4772]: E0930 17:41:00.898929 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.910200 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a025bcd-8420-43af-b3ef-f6c3b1c5941d" path="/var/lib/kubelet/pods/1a025bcd-8420-43af-b3ef-f6c3b1c5941d/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.911402 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527cfb73-f3fa-4746-8174-788942e65624" path="/var/lib/kubelet/pods/527cfb73-f3fa-4746-8174-788942e65624/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.911930 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fe2017-52c6-4086-ac96-73f822eb744d" path="/var/lib/kubelet/pods/59fe2017-52c6-4086-ac96-73f822eb744d/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.912521 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a109e03b-45b5-4c40-91d5-d9719de7cce8" path="/var/lib/kubelet/pods/a109e03b-45b5-4c40-91d5-d9719de7cce8/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.915787 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addb4fb1-d812-4472-a08e-742c97c9b6d2" path="/var/lib/kubelet/pods/addb4fb1-d812-4472-a08e-742c97c9b6d2/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.916329 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29acfcb-d20b-4be6-a22b-e0e0bc5deae0" path="/var/lib/kubelet/pods/b29acfcb-d20b-4be6-a22b-e0e0bc5deae0/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.916873 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0" path="/var/lib/kubelet/pods/bd6a5ec7-dc1b-4ef6-816d-308b4893a7d0/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.917426 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf660345-e15a-46d9-b1f2-8cd460d61a9d" path="/var/lib/kubelet/pods/bf660345-e15a-46d9-b1f2-8cd460d61a9d/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.917935 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae3e1d5-7f5f-468b-8b80-a8d0c012d017" path="/var/lib/kubelet/pods/cae3e1d5-7f5f-468b-8b80-a8d0c012d017/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.918469 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9" path="/var/lib/kubelet/pods/d6ddfb38-85bf-4bb9-a53b-6d33d6862fd9/volumes" Sep 30 17:41:01 crc kubenswrapper[4772]: I0930 17:41:01.919008 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecfdbbc-226d-4daf-b162-43aeefc9d100" path="/var/lib/kubelet/pods/fecfdbbc-226d-4daf-b162-43aeefc9d100/volumes" Sep 30 17:41:13 crc kubenswrapper[4772]: 
Sep 30 17:41:13 crc kubenswrapper[4772]: E0930 17:41:13.525252 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="extract-utilities"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.525270 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="extract-utilities"
Sep 30 17:41:13 crc kubenswrapper[4772]: E0930 17:41:13.525323 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="extract-content"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.525331 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="extract-content"
Sep 30 17:41:13 crc kubenswrapper[4772]: E0930 17:41:13.525341 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="registry-server"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.525348 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="registry-server"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.525555 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c0967e-497d-4496-bbd5-32bd6045b6ae" containerName="registry-server"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.526488 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.529440 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.529457 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.529908 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.530087 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.536823 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.547428 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"]
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.563229 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.563492 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6zj\" (UniqueName: \"kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.563615 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.563694 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.563821 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.665564 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.665890 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6zj\" (UniqueName: \"kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.665946 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.665969 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.666023 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.671578 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.671791 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.672568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.673084 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.681696 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6zj\" (UniqueName: \"kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Sep 30 17:41:13 crc kubenswrapper[4772]: I0930 17:41:13.846608 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" Sep 30 17:41:14 crc kubenswrapper[4772]: I0930 17:41:14.409508 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v"] Sep 30 17:41:15 crc kubenswrapper[4772]: I0930 17:41:15.085998 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" event={"ID":"489fcf90-05fb-484f-9cd9-6b403023229a","Type":"ContainerStarted","Data":"4e591778038648c64cac4c80a3a435a55549ff7527491e5b126f588b5303add3"} Sep 30 17:41:15 crc kubenswrapper[4772]: I0930 17:41:15.899085 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:41:15 crc kubenswrapper[4772]: E0930 17:41:15.899634 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:41:16 crc kubenswrapper[4772]: I0930 17:41:16.094858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" event={"ID":"489fcf90-05fb-484f-9cd9-6b403023229a","Type":"ContainerStarted","Data":"0f488a55d4015dcaf7a2563a5b02965eaa692e53ace9b130e09312b51cc6c78c"} Sep 30 17:41:16 crc kubenswrapper[4772]: I0930 17:41:16.114683 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" podStartSLOduration=2.665709512 podStartE2EDuration="3.114666123s" podCreationTimestamp="2025-09-30 17:41:13 +0000 UTC" firstStartedPulling="2025-09-30 17:41:14.437090338 +0000 UTC m=+2375.344103169" lastFinishedPulling="2025-09-30 17:41:14.886046949 +0000 UTC m=+2375.793059780" observedRunningTime="2025-09-30 17:41:16.111271483 +0000 UTC m=+2377.018284314" watchObservedRunningTime="2025-09-30 17:41:16.114666123 +0000 UTC m=+2377.021678954" Sep 30 17:41:27 crc kubenswrapper[4772]: I0930 17:41:27.206785 4772 generic.go:334] "Generic (PLEG): container finished" podID="489fcf90-05fb-484f-9cd9-6b403023229a" containerID="0f488a55d4015dcaf7a2563a5b02965eaa692e53ace9b130e09312b51cc6c78c" exitCode=0 Sep 30 17:41:27 crc kubenswrapper[4772]: I0930 17:41:27.206860 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" event={"ID":"489fcf90-05fb-484f-9cd9-6b403023229a","Type":"ContainerDied","Data":"0f488a55d4015dcaf7a2563a5b02965eaa692e53ace9b130e09312b51cc6c78c"} Sep 30 17:41:27 crc kubenswrapper[4772]: I0930 17:41:27.899077 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:41:27 crc kubenswrapper[4772]: E0930 17:41:27.899868 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 
17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.693539 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.801458 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key\") pod \"489fcf90-05fb-484f-9cd9-6b403023229a\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.801542 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle\") pod \"489fcf90-05fb-484f-9cd9-6b403023229a\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.801930 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory\") pod \"489fcf90-05fb-484f-9cd9-6b403023229a\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.802004 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq6zj\" (UniqueName: \"kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj\") pod \"489fcf90-05fb-484f-9cd9-6b403023229a\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.802085 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph\") pod \"489fcf90-05fb-484f-9cd9-6b403023229a\" (UID: \"489fcf90-05fb-484f-9cd9-6b403023229a\") " Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.808192 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj" (OuterVolumeSpecName: "kube-api-access-lq6zj") pod "489fcf90-05fb-484f-9cd9-6b403023229a" (UID: "489fcf90-05fb-484f-9cd9-6b403023229a"). InnerVolumeSpecName "kube-api-access-lq6zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.808606 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph" (OuterVolumeSpecName: "ceph") pod "489fcf90-05fb-484f-9cd9-6b403023229a" (UID: "489fcf90-05fb-484f-9cd9-6b403023229a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.808785 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "489fcf90-05fb-484f-9cd9-6b403023229a" (UID: "489fcf90-05fb-484f-9cd9-6b403023229a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.832392 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory" (OuterVolumeSpecName: "inventory") pod "489fcf90-05fb-484f-9cd9-6b403023229a" (UID: "489fcf90-05fb-484f-9cd9-6b403023229a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.839999 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "489fcf90-05fb-484f-9cd9-6b403023229a" (UID: "489fcf90-05fb-484f-9cd9-6b403023229a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.905020 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.905102 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq6zj\" (UniqueName: \"kubernetes.io/projected/489fcf90-05fb-484f-9cd9-6b403023229a-kube-api-access-lq6zj\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.905128 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.905145 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4772]: I0930 17:41:28.905163 4772 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489fcf90-05fb-484f-9cd9-6b403023229a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.230909 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" event={"ID":"489fcf90-05fb-484f-9cd9-6b403023229a","Type":"ContainerDied","Data":"4e591778038648c64cac4c80a3a435a55549ff7527491e5b126f588b5303add3"} Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.230947 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e591778038648c64cac4c80a3a435a55549ff7527491e5b126f588b5303add3" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.230956 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.374130 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h"] Sep 30 17:41:29 crc kubenswrapper[4772]: E0930 17:41:29.374597 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489fcf90-05fb-484f-9cd9-6b403023229a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.374623 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="489fcf90-05fb-484f-9cd9-6b403023229a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.374855 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="489fcf90-05fb-484f-9cd9-6b403023229a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.375714 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.381402 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.381713 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.381945 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.381954 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.382251 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.389482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h"] Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.516281 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.516351 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.516561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.516608 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.516738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmjmb\" (UniqueName: \"kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.618852 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmjmb\" (UniqueName: \"kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.619086 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.619187 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.619419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.620397 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.623904 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.624183 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.624942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.630266 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.634828 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmjmb\" (UniqueName: \"kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:29 crc kubenswrapper[4772]: I0930 17:41:29.696852 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:41:30 crc kubenswrapper[4772]: I0930 17:41:30.247723 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h"] Sep 30 17:41:31 crc kubenswrapper[4772]: I0930 17:41:31.250757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" event={"ID":"f953fcc8-8726-4ec2-a493-d67f3f540054","Type":"ContainerStarted","Data":"900a66e0ff7bfedb56bf814c902de3a89d182fcd6e5825141c13acb9d80ea378"} Sep 30 17:41:31 crc kubenswrapper[4772]: I0930 17:41:31.251133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" event={"ID":"f953fcc8-8726-4ec2-a493-d67f3f540054","Type":"ContainerStarted","Data":"8cd8d4953e270df68ec400d56c557e10a7f40e2afe0d63eff738f97afa5ef834"} Sep 30 17:41:31 crc kubenswrapper[4772]: I0930 17:41:31.272244 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" podStartSLOduration=1.803611902 podStartE2EDuration="2.272224548s" podCreationTimestamp="2025-09-30 17:41:29 +0000 UTC" firstStartedPulling="2025-09-30 17:41:30.248351501 +0000 UTC m=+2391.155364332" lastFinishedPulling="2025-09-30 17:41:30.716964147 +0000 UTC m=+2391.623976978" observedRunningTime="2025-09-30 17:41:31.266502035 +0000 UTC m=+2392.173514866" watchObservedRunningTime="2025-09-30 17:41:31.272224548 +0000 UTC m=+2392.179237379" Sep 30 17:41:38 crc kubenswrapper[4772]: I0930 17:41:38.898911 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:41:38 crc kubenswrapper[4772]: E0930 17:41:38.899672 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.125233 4772 scope.go:117] "RemoveContainer" containerID="744226b27b19ad7d102f4a6fdeaf2d1472442ffde967e3f04a2c8fdc189a37cc" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.164163 4772 scope.go:117] "RemoveContainer" containerID="02a56a314fbacc630045857f1f3bc8fefc2d3ae54e6874c6cf9603efccd6e32f" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.220465 4772 scope.go:117] "RemoveContainer" containerID="35ddad75ce72179dea54caffc34bc1b944e074b011f24e19e3d0101078893161" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.248544 4772 scope.go:117] "RemoveContainer" containerID="5bbd02d06ae963aa1269f44bbbaf1b8a4046da66809139963d3afe2fa9fbbfbb" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.308409 4772 scope.go:117] "RemoveContainer" containerID="2cc52942941c96f043717c0024add281d310abf2fab02cb7365b1a4a2029d083" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.362227 4772 scope.go:117] "RemoveContainer" containerID="9b6432f3ad15c8e9f97708735999655110f936e734697dcf5b94d4d6cb59a4ae" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.441139 4772 scope.go:117] "RemoveContainer" containerID="9bf5673843f926b2c944482cb626a0dce054fa127abd39c639b4cba04fcab37b" Sep 30 17:41:44 crc 
kubenswrapper[4772]: I0930 17:41:44.473333 4772 scope.go:117] "RemoveContainer" containerID="965d300a81ec275fec1c1370c9a23947799ce1918e980807fff6123cfb2476fb" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.522834 4772 scope.go:117] "RemoveContainer" containerID="70b341f0919502bad9dc4b69c8c26525acfc5146983f554488efdb7041b8a4ec" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.555759 4772 scope.go:117] "RemoveContainer" containerID="2168037135d0ef970a9fc055a874188222affce2f41bb707b5d1eff228a56234" Sep 30 17:41:44 crc kubenswrapper[4772]: I0930 17:41:44.602616 4772 scope.go:117] "RemoveContainer" containerID="aa122e777b22c73b768dd39c8ebe05a682a970f6136f37f70f3b94181a745922" Sep 30 17:41:52 crc kubenswrapper[4772]: I0930 17:41:52.897757 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:41:52 crc kubenswrapper[4772]: E0930 17:41:52.898437 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:42:05 crc kubenswrapper[4772]: I0930 17:42:05.899206 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:42:05 crc kubenswrapper[4772]: E0930 17:42:05.899950 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:42:19 crc kubenswrapper[4772]: I0930 17:42:19.905452 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:42:19 crc kubenswrapper[4772]: E0930 17:42:19.906644 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:42:32 crc kubenswrapper[4772]: I0930 17:42:32.898757 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:42:32 crc kubenswrapper[4772]: E0930 17:42:32.900309 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:42:44 crc kubenswrapper[4772]: I0930 17:42:44.898014 4772 scope.go:117] "RemoveContainer" 
containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:42:44 crc kubenswrapper[4772]: E0930 17:42:44.899344 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:42:55 crc kubenswrapper[4772]: I0930 17:42:55.899074 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:42:55 crc kubenswrapper[4772]: E0930 17:42:55.900160 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:43:06 crc kubenswrapper[4772]: I0930 17:43:06.898689 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:43:06 crc kubenswrapper[4772]: E0930 17:43:06.899493 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:43:11 crc kubenswrapper[4772]: I0930 17:43:11.118545 4772 generic.go:334] "Generic (PLEG): container finished" podID="f953fcc8-8726-4ec2-a493-d67f3f540054" containerID="900a66e0ff7bfedb56bf814c902de3a89d182fcd6e5825141c13acb9d80ea378" exitCode=0 Sep 30 17:43:11 crc kubenswrapper[4772]: I0930 17:43:11.118664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" event={"ID":"f953fcc8-8726-4ec2-a493-d67f3f540054","Type":"ContainerDied","Data":"900a66e0ff7bfedb56bf814c902de3a89d182fcd6e5825141c13acb9d80ea378"} Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.533769 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.592098 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle\") pod \"f953fcc8-8726-4ec2-a493-d67f3f540054\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.592153 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory\") pod \"f953fcc8-8726-4ec2-a493-d67f3f540054\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.592271 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph\") pod \"f953fcc8-8726-4ec2-a493-d67f3f540054\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.592304 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key\") pod \"f953fcc8-8726-4ec2-a493-d67f3f540054\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.592356 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmjmb\" (UniqueName: \"kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb\") pod \"f953fcc8-8726-4ec2-a493-d67f3f540054\" (UID: \"f953fcc8-8726-4ec2-a493-d67f3f540054\") " Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.598478 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb" (OuterVolumeSpecName: "kube-api-access-fmjmb") pod "f953fcc8-8726-4ec2-a493-d67f3f540054" (UID: "f953fcc8-8726-4ec2-a493-d67f3f540054"). InnerVolumeSpecName "kube-api-access-fmjmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.598870 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f953fcc8-8726-4ec2-a493-d67f3f540054" (UID: "f953fcc8-8726-4ec2-a493-d67f3f540054"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.601210 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph" (OuterVolumeSpecName: "ceph") pod "f953fcc8-8726-4ec2-a493-d67f3f540054" (UID: "f953fcc8-8726-4ec2-a493-d67f3f540054"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.619499 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f953fcc8-8726-4ec2-a493-d67f3f540054" (UID: "f953fcc8-8726-4ec2-a493-d67f3f540054"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.631531 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory" (OuterVolumeSpecName: "inventory") pod "f953fcc8-8726-4ec2-a493-d67f3f540054" (UID: "f953fcc8-8726-4ec2-a493-d67f3f540054"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.694544 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmjmb\" (UniqueName: \"kubernetes.io/projected/f953fcc8-8726-4ec2-a493-d67f3f540054-kube-api-access-fmjmb\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.694592 4772 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.694606 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.694621 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:12 crc kubenswrapper[4772]: I0930 17:43:12.694632 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f953fcc8-8726-4ec2-a493-d67f3f540054-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.138464 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" event={"ID":"f953fcc8-8726-4ec2-a493-d67f3f540054","Type":"ContainerDied","Data":"8cd8d4953e270df68ec400d56c557e10a7f40e2afe0d63eff738f97afa5ef834"} Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.138524 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd8d4953e270df68ec400d56c557e10a7f40e2afe0d63eff738f97afa5ef834" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.138558 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.238433 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps"] Sep 30 17:43:13 crc kubenswrapper[4772]: E0930 17:43:13.238810 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f953fcc8-8726-4ec2-a493-d67f3f540054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.238828 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f953fcc8-8726-4ec2-a493-d67f3f540054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.239009 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f953fcc8-8726-4ec2-a493-d67f3f540054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.239718 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.242792 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.242856 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.243042 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.243320 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.247945 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps"] Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.251884 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.305516 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69cq\" (UniqueName: \"kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.305575 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.305625 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: 
\"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.305766 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.407217 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.407353 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.407442 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w69cq\" (UniqueName: \"kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.407482 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.411166 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.411192 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.412460 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.424512 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69cq\" (UniqueName: \"kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vmjps\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:13 crc kubenswrapper[4772]: I0930 17:43:13.558725 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:14 crc kubenswrapper[4772]: I0930 17:43:14.088317 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps"] Sep 30 17:43:14 crc kubenswrapper[4772]: I0930 17:43:14.147442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" event={"ID":"f9277e5c-9f8e-4c7c-a979-03fce35dab53","Type":"ContainerStarted","Data":"2faffe05104e9cf24372780da7dd0387d32e5ff89ecc70213d05f0640c04decd"} Sep 30 17:43:15 crc kubenswrapper[4772]: I0930 17:43:15.166331 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" event={"ID":"f9277e5c-9f8e-4c7c-a979-03fce35dab53","Type":"ContainerStarted","Data":"7674255cf8ff3ed385c08887a1988a50cbba468ed219d4dc456b6507cddedf8b"} Sep 30 17:43:15 crc kubenswrapper[4772]: I0930 17:43:15.191610 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" podStartSLOduration=1.7392460189999999 podStartE2EDuration="2.191586673s" podCreationTimestamp="2025-09-30 17:43:13 +0000 UTC" firstStartedPulling="2025-09-30 17:43:14.0951199 +0000 UTC m=+2495.002132731" lastFinishedPulling="2025-09-30 17:43:14.547460514 +0000 UTC m=+2495.454473385" observedRunningTime="2025-09-30 17:43:15.189425676 +0000 UTC m=+2496.096438517" watchObservedRunningTime="2025-09-30 17:43:15.191586673 +0000 UTC m=+2496.098599504" Sep 30 17:43:18 crc kubenswrapper[4772]: I0930 17:43:18.899321 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:43:18 crc kubenswrapper[4772]: E0930 17:43:18.900226 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:43:30 crc kubenswrapper[4772]: I0930 17:43:30.899729 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:43:30 crc kubenswrapper[4772]: E0930 17:43:30.900944 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:43:42 crc kubenswrapper[4772]: I0930 17:43:42.391316 4772 generic.go:334] "Generic (PLEG): container finished" podID="f9277e5c-9f8e-4c7c-a979-03fce35dab53" containerID="7674255cf8ff3ed385c08887a1988a50cbba468ed219d4dc456b6507cddedf8b" exitCode=0 Sep 30 17:43:42 crc kubenswrapper[4772]: I0930 17:43:42.391428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" event={"ID":"f9277e5c-9f8e-4c7c-a979-03fce35dab53","Type":"ContainerDied","Data":"7674255cf8ff3ed385c08887a1988a50cbba468ed219d4dc456b6507cddedf8b"} Sep 30 17:43:43 crc kubenswrapper[4772]: I0930 17:43:43.865508 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.031519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory\") pod \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.031842 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph\") pod \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.031887 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key\") pod \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.031963 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w69cq\" (UniqueName: \"kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq\") pod \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\" (UID: \"f9277e5c-9f8e-4c7c-a979-03fce35dab53\") " Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.037519 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph" (OuterVolumeSpecName: "ceph") pod "f9277e5c-9f8e-4c7c-a979-03fce35dab53" (UID: "f9277e5c-9f8e-4c7c-a979-03fce35dab53"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.038015 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq" (OuterVolumeSpecName: "kube-api-access-w69cq") pod "f9277e5c-9f8e-4c7c-a979-03fce35dab53" (UID: "f9277e5c-9f8e-4c7c-a979-03fce35dab53"). InnerVolumeSpecName "kube-api-access-w69cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.059938 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f9277e5c-9f8e-4c7c-a979-03fce35dab53" (UID: "f9277e5c-9f8e-4c7c-a979-03fce35dab53"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.062815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory" (OuterVolumeSpecName: "inventory") pod "f9277e5c-9f8e-4c7c-a979-03fce35dab53" (UID: "f9277e5c-9f8e-4c7c-a979-03fce35dab53"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.134801 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.135151 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.135264 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9277e5c-9f8e-4c7c-a979-03fce35dab53-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.135349 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w69cq\" (UniqueName: \"kubernetes.io/projected/f9277e5c-9f8e-4c7c-a979-03fce35dab53-kube-api-access-w69cq\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.410297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" event={"ID":"f9277e5c-9f8e-4c7c-a979-03fce35dab53","Type":"ContainerDied","Data":"2faffe05104e9cf24372780da7dd0387d32e5ff89ecc70213d05f0640c04decd"} Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.410360 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2faffe05104e9cf24372780da7dd0387d32e5ff89ecc70213d05f0640c04decd" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.410320 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vmjps" Sep 30 17:43:44 crc kubenswrapper[4772]: E0930 17:43:44.492503 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9277e5c_9f8e_4c7c_a979_03fce35dab53.slice\": RecentStats: unable to find data in memory cache]" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.511931 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj"] Sep 30 17:43:44 crc kubenswrapper[4772]: E0930 17:43:44.512561 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9277e5c-9f8e-4c7c-a979-03fce35dab53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.512597 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9277e5c-9f8e-4c7c-a979-03fce35dab53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.512849 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9277e5c-9f8e-4c7c-a979-03fce35dab53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.513761 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.516444 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.516789 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.520614 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.520855 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.520678 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.526305 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj"] Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.645431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.645698 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc 
kubenswrapper[4772]: I0930 17:43:44.645838 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.645900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcw2p\" (UniqueName: \"kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.747834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.748400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.748458 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcw2p\" (UniqueName: \"kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.748554 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.752971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.753226 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: 
I0930 17:43:44.756292 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.768689 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcw2p\" (UniqueName: \"kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7szsj\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.836244 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:44 crc kubenswrapper[4772]: I0930 17:43:44.902049 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:43:44 crc kubenswrapper[4772]: E0930 17:43:44.902446 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:43:45 crc kubenswrapper[4772]: I0930 17:43:45.377824 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj"] Sep 30 17:43:45 crc kubenswrapper[4772]: I0930 17:43:45.419688 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" event={"ID":"688b1ee3-fe2c-4d2d-917f-17510c9d980a","Type":"ContainerStarted","Data":"215ec57d6af101c49f41875ebfe8c32f0cacbd5aa2cf7713389220805f8b1e83"} Sep 30 17:43:46 crc kubenswrapper[4772]: I0930 17:43:46.430018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" event={"ID":"688b1ee3-fe2c-4d2d-917f-17510c9d980a","Type":"ContainerStarted","Data":"3170c6dcb8a47d762985f63222b9bf4b7bdc02e0862240e6329b40c284cacbd6"} Sep 30 17:43:46 crc kubenswrapper[4772]: I0930 17:43:46.457222 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" podStartSLOduration=2.060849936 podStartE2EDuration="2.457203211s" podCreationTimestamp="2025-09-30 17:43:44 +0000 UTC" firstStartedPulling="2025-09-30 17:43:45.384796289 +0000 UTC m=+2526.291809120" lastFinishedPulling="2025-09-30 17:43:45.781149574 +0000 UTC m=+2526.688162395" observedRunningTime="2025-09-30 17:43:46.445692579 +0000 UTC m=+2527.352705420" watchObservedRunningTime="2025-09-30 17:43:46.457203211 +0000 UTC m=+2527.364216042" Sep 30 17:43:51 crc kubenswrapper[4772]: I0930 17:43:51.474153 4772 generic.go:334] "Generic (PLEG): container finished" podID="688b1ee3-fe2c-4d2d-917f-17510c9d980a" containerID="3170c6dcb8a47d762985f63222b9bf4b7bdc02e0862240e6329b40c284cacbd6" exitCode=0 Sep 30 17:43:51 crc 
kubenswrapper[4772]: I0930 17:43:51.474279 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" event={"ID":"688b1ee3-fe2c-4d2d-917f-17510c9d980a","Type":"ContainerDied","Data":"3170c6dcb8a47d762985f63222b9bf4b7bdc02e0862240e6329b40c284cacbd6"} Sep 30 17:43:52 crc kubenswrapper[4772]: I0930 17:43:52.859763 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.035248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key\") pod \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.035373 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory\") pod \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.035438 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph\") pod \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.035479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcw2p\" (UniqueName: \"kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p\") pod \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\" (UID: \"688b1ee3-fe2c-4d2d-917f-17510c9d980a\") " Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.044558 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph" (OuterVolumeSpecName: "ceph") pod "688b1ee3-fe2c-4d2d-917f-17510c9d980a" (UID: "688b1ee3-fe2c-4d2d-917f-17510c9d980a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.044805 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p" (OuterVolumeSpecName: "kube-api-access-tcw2p") pod "688b1ee3-fe2c-4d2d-917f-17510c9d980a" (UID: "688b1ee3-fe2c-4d2d-917f-17510c9d980a"). InnerVolumeSpecName "kube-api-access-tcw2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.062076 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "688b1ee3-fe2c-4d2d-917f-17510c9d980a" (UID: "688b1ee3-fe2c-4d2d-917f-17510c9d980a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.063982 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory" (OuterVolumeSpecName: "inventory") pod "688b1ee3-fe2c-4d2d-917f-17510c9d980a" (UID: "688b1ee3-fe2c-4d2d-917f-17510c9d980a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.138603 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.138709 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.138721 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/688b1ee3-fe2c-4d2d-917f-17510c9d980a-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.138730 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcw2p\" (UniqueName: \"kubernetes.io/projected/688b1ee3-fe2c-4d2d-917f-17510c9d980a-kube-api-access-tcw2p\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.533634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" event={"ID":"688b1ee3-fe2c-4d2d-917f-17510c9d980a","Type":"ContainerDied","Data":"215ec57d6af101c49f41875ebfe8c32f0cacbd5aa2cf7713389220805f8b1e83"} Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.533693 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="215ec57d6af101c49f41875ebfe8c32f0cacbd5aa2cf7713389220805f8b1e83" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.533728 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7szsj" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.568792 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh"] Sep 30 17:43:53 crc kubenswrapper[4772]: E0930 17:43:53.569440 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b1ee3-fe2c-4d2d-917f-17510c9d980a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.569504 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b1ee3-fe2c-4d2d-917f-17510c9d980a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.569830 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="688b1ee3-fe2c-4d2d-917f-17510c9d980a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.570820 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.573277 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.573621 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.573749 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.574459 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.574706 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.579905 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh"] Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.749365 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbhb\" (UniqueName: \"kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.749451 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.749529 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.749573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.851916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.852456 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.852553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.852714 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbhb\" (UniqueName: \"kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.859702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.860075 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.860443 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.873565 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbhb\" (UniqueName: \"kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nswzh\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:53 crc kubenswrapper[4772]: I0930 17:43:53.898198 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:43:54 crc kubenswrapper[4772]: I0930 17:43:54.448829 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh"] Sep 30 17:43:54 crc kubenswrapper[4772]: I0930 17:43:54.542850 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" event={"ID":"71289a51-de10-4dea-8aca-4a3cbd177e65","Type":"ContainerStarted","Data":"597ba7b6a4b5687a699c20ed2abd5a6ba5f8a8629eb680ff5796d1d8d3ac14c9"} Sep 30 17:43:55 crc kubenswrapper[4772]: I0930 17:43:55.552908 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" event={"ID":"71289a51-de10-4dea-8aca-4a3cbd177e65","Type":"ContainerStarted","Data":"7d0f3ac26f961937eea4928971b66fc67ed6783db4483ef1a8ef43e6ca6c0356"} Sep 30 17:43:55 crc kubenswrapper[4772]: I0930 17:43:55.576090 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" podStartSLOduration=2.086133177 podStartE2EDuration="2.576038507s" podCreationTimestamp="2025-09-30 17:43:53 +0000 UTC" firstStartedPulling="2025-09-30 17:43:54.451399135 +0000 UTC m=+2535.358411966" lastFinishedPulling="2025-09-30 17:43:54.941304455 +0000 UTC m=+2535.848317296" observedRunningTime="2025-09-30 17:43:55.57083783 +0000 UTC m=+2536.477850671" watchObservedRunningTime="2025-09-30 17:43:55.576038507 +0000 UTC m=+2536.483051338" Sep 30 17:43:56 crc kubenswrapper[4772]: I0930 17:43:56.898165 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:43:56 crc kubenswrapper[4772]: E0930 17:43:56.898432 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:44:11 crc kubenswrapper[4772]: I0930 17:44:11.898695 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:44:11 crc kubenswrapper[4772]: E0930 17:44:11.902576 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:44:25 crc kubenswrapper[4772]: I0930 17:44:25.898739 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:44:25 crc kubenswrapper[4772]: E0930 17:44:25.899661 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:44:32 crc kubenswrapper[4772]: I0930 17:44:32.868091 4772 generic.go:334] "Generic (PLEG): container finished" podID="71289a51-de10-4dea-8aca-4a3cbd177e65" containerID="7d0f3ac26f961937eea4928971b66fc67ed6783db4483ef1a8ef43e6ca6c0356" exitCode=0 Sep 30 17:44:32 crc kubenswrapper[4772]: I0930 17:44:32.868228 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" event={"ID":"71289a51-de10-4dea-8aca-4a3cbd177e65","Type":"ContainerDied","Data":"7d0f3ac26f961937eea4928971b66fc67ed6783db4483ef1a8ef43e6ca6c0356"} Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.296211 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.348797 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory\") pod \"71289a51-de10-4dea-8aca-4a3cbd177e65\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.348946 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxbhb\" (UniqueName: \"kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb\") pod \"71289a51-de10-4dea-8aca-4a3cbd177e65\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.348976 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key\") pod \"71289a51-de10-4dea-8aca-4a3cbd177e65\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.349802 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph\") pod \"71289a51-de10-4dea-8aca-4a3cbd177e65\" (UID: \"71289a51-de10-4dea-8aca-4a3cbd177e65\") " Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.356219 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph" (OuterVolumeSpecName: "ceph") pod "71289a51-de10-4dea-8aca-4a3cbd177e65" (UID: "71289a51-de10-4dea-8aca-4a3cbd177e65"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.356418 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb" (OuterVolumeSpecName: "kube-api-access-zxbhb") pod "71289a51-de10-4dea-8aca-4a3cbd177e65" (UID: "71289a51-de10-4dea-8aca-4a3cbd177e65"). InnerVolumeSpecName "kube-api-access-zxbhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.380572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory" (OuterVolumeSpecName: "inventory") pod "71289a51-de10-4dea-8aca-4a3cbd177e65" (UID: "71289a51-de10-4dea-8aca-4a3cbd177e65"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.391185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71289a51-de10-4dea-8aca-4a3cbd177e65" (UID: "71289a51-de10-4dea-8aca-4a3cbd177e65"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.452364 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.452417 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxbhb\" (UniqueName: \"kubernetes.io/projected/71289a51-de10-4dea-8aca-4a3cbd177e65-kube-api-access-zxbhb\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.452432 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.452443 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71289a51-de10-4dea-8aca-4a3cbd177e65-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.885673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" event={"ID":"71289a51-de10-4dea-8aca-4a3cbd177e65","Type":"ContainerDied","Data":"597ba7b6a4b5687a699c20ed2abd5a6ba5f8a8629eb680ff5796d1d8d3ac14c9"} Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.885941 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="597ba7b6a4b5687a699c20ed2abd5a6ba5f8a8629eb680ff5796d1d8d3ac14c9" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.885731 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nswzh" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.973621 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j"] Sep 30 17:44:34 crc kubenswrapper[4772]: E0930 17:44:34.974499 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71289a51-de10-4dea-8aca-4a3cbd177e65" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.974523 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="71289a51-de10-4dea-8aca-4a3cbd177e65" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.974757 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="71289a51-de10-4dea-8aca-4a3cbd177e65" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.976014 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.978825 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.978966 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.979196 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.979403 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.979553 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:44:34 crc kubenswrapper[4772]: I0930 17:44:34.983574 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j"] Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.066013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhnc\" (UniqueName: \"kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.066153 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.066180 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.066203 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.167961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.168009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.168034 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.168150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhnc\" (UniqueName: \"kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.178667 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.178741 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.178967 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.185788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhnc\" (UniqueName: \"kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.298940 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.803726 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j"] Sep 30 17:44:35 crc kubenswrapper[4772]: I0930 17:44:35.896100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" event={"ID":"d001e435-b677-46e3-a31b-f5d1ae7e5c01","Type":"ContainerStarted","Data":"922a8be25e10eba60c58a3a720b90ae4074c0c5b450ed4a6d012a15f63753867"} Sep 30 17:44:36 crc kubenswrapper[4772]: I0930 17:44:36.905416 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" event={"ID":"d001e435-b677-46e3-a31b-f5d1ae7e5c01","Type":"ContainerStarted","Data":"920027b09edfaed3d8bd414c7a6eec8c147d40017b3c50cad3fabd23e379e434"} Sep 30 17:44:36 crc kubenswrapper[4772]: I0930 17:44:36.924748 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" podStartSLOduration=2.489225674 podStartE2EDuration="2.924720816s" podCreationTimestamp="2025-09-30 17:44:34 +0000 UTC" firstStartedPulling="2025-09-30 17:44:35.813383112 +0000 UTC m=+2576.720395943" lastFinishedPulling="2025-09-30 17:44:36.248878254 +0000 UTC m=+2577.155891085" observedRunningTime="2025-09-30 17:44:36.920617348 +0000 UTC m=+2577.827630189" watchObservedRunningTime="2025-09-30 17:44:36.924720816 +0000 UTC m=+2577.831733647" Sep 30 17:44:39 crc kubenswrapper[4772]: I0930 17:44:39.908759 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:44:39 crc kubenswrapper[4772]: E0930 17:44:39.911466 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:44:40 crc kubenswrapper[4772]: I0930 17:44:40.947700 4772 generic.go:334] "Generic (PLEG): container finished" podID="d001e435-b677-46e3-a31b-f5d1ae7e5c01" containerID="920027b09edfaed3d8bd414c7a6eec8c147d40017b3c50cad3fabd23e379e434" exitCode=0 Sep 30 17:44:40 crc kubenswrapper[4772]: I0930 17:44:40.947748 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" event={"ID":"d001e435-b677-46e3-a31b-f5d1ae7e5c01","Type":"ContainerDied","Data":"920027b09edfaed3d8bd414c7a6eec8c147d40017b3c50cad3fabd23e379e434"} Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.379198 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.415232 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnhnc\" (UniqueName: \"kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc\") pod \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.415362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key\") pod \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.415478 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory\") pod \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.415576 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph\") pod \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\" (UID: \"d001e435-b677-46e3-a31b-f5d1ae7e5c01\") " Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.420367 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc" (OuterVolumeSpecName: "kube-api-access-wnhnc") pod "d001e435-b677-46e3-a31b-f5d1ae7e5c01" (UID: "d001e435-b677-46e3-a31b-f5d1ae7e5c01"). InnerVolumeSpecName "kube-api-access-wnhnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.420911 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph" (OuterVolumeSpecName: "ceph") pod "d001e435-b677-46e3-a31b-f5d1ae7e5c01" (UID: "d001e435-b677-46e3-a31b-f5d1ae7e5c01"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.441365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory" (OuterVolumeSpecName: "inventory") pod "d001e435-b677-46e3-a31b-f5d1ae7e5c01" (UID: "d001e435-b677-46e3-a31b-f5d1ae7e5c01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.449658 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d001e435-b677-46e3-a31b-f5d1ae7e5c01" (UID: "d001e435-b677-46e3-a31b-f5d1ae7e5c01"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.518004 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.518035 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.518046 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d001e435-b677-46e3-a31b-f5d1ae7e5c01-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.518067 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnhnc\" (UniqueName: \"kubernetes.io/projected/d001e435-b677-46e3-a31b-f5d1ae7e5c01-kube-api-access-wnhnc\") on node \"crc\" DevicePath \"\"" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.968766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" event={"ID":"d001e435-b677-46e3-a31b-f5d1ae7e5c01","Type":"ContainerDied","Data":"922a8be25e10eba60c58a3a720b90ae4074c0c5b450ed4a6d012a15f63753867"} Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.968820 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="922a8be25e10eba60c58a3a720b90ae4074c0c5b450ed4a6d012a15f63753867" Sep 30 17:44:42 crc kubenswrapper[4772]: I0930 17:44:42.969337 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.047830 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx"] Sep 30 17:44:43 crc kubenswrapper[4772]: E0930 17:44:43.048424 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d001e435-b677-46e3-a31b-f5d1ae7e5c01" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.048471 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d001e435-b677-46e3-a31b-f5d1ae7e5c01" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.048730 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d001e435-b677-46e3-a31b-f5d1ae7e5c01" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.049968 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.052961 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.052992 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.053140 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.054640 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.060497 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.079155 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx"] Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.133611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7lc\" (UniqueName: \"kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.133714 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.134102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.134378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.236559 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.236944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.237028 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7lc\" (UniqueName: \"kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.237093 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.240120 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.240388 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.245513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.252430 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7lc\" (UniqueName: \"kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.379249 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.875947 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx"] Sep 30 17:44:43 crc kubenswrapper[4772]: I0930 17:44:43.978582 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" event={"ID":"104de20c-fde6-42d5-aa8b-f23445a3661e","Type":"ContainerStarted","Data":"4653904b0ca7d71de6de85f20ac8cb63d624af6415d159b424346debddc7d844"} Sep 30 17:44:44 crc kubenswrapper[4772]: I0930 17:44:44.989467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" event={"ID":"104de20c-fde6-42d5-aa8b-f23445a3661e","Type":"ContainerStarted","Data":"b437cc133fd54050e8628e7ac4699ce2ff61ab571ed422b68629638060154286"} Sep 30 17:44:45 crc kubenswrapper[4772]: I0930 17:44:45.011423 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" podStartSLOduration=1.565277306 podStartE2EDuration="2.011395987s" podCreationTimestamp="2025-09-30 17:44:43 +0000 UTC" firstStartedPulling="2025-09-30 17:44:43.884129286 +0000 UTC m=+2584.791142117" lastFinishedPulling="2025-09-30 17:44:44.330247967 +0000 UTC m=+2585.237260798" observedRunningTime="2025-09-30 17:44:45.006015066 +0000 UTC m=+2585.913027917" watchObservedRunningTime="2025-09-30 17:44:45.011395987 +0000 UTC m=+2585.918408818" Sep 30 17:44:51 crc kubenswrapper[4772]: I0930 17:44:51.897956 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:44:51 crc kubenswrapper[4772]: E0930 17:44:51.898632 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.148815 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8"] Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.151208 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.153817 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.154848 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.156944 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8"] Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.254445 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbts8\" (UniqueName: \"kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.254536 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.254913 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.356434 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.357515 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.357676 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbts8\" (UniqueName: \"kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.357838 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume\") pod 
\"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.374875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbts8\" (UniqueName: \"kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.377675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume\") pod \"collect-profiles-29320905-tscv8\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.512570 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:00 crc kubenswrapper[4772]: I0930 17:45:00.947587 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8"] Sep 30 17:45:01 crc kubenswrapper[4772]: I0930 17:45:01.124150 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" event={"ID":"801a662c-6452-49ea-962b-2ebfbc394f8f","Type":"ContainerStarted","Data":"87bd3344f388ccf979ed29cb9190ed0f3537a28fcf44204d2e249d94fd22a596"} Sep 30 17:45:02 crc kubenswrapper[4772]: I0930 17:45:02.131824 4772 generic.go:334] "Generic (PLEG): container finished" podID="801a662c-6452-49ea-962b-2ebfbc394f8f" containerID="edce61bdabf2cb239d68e624ddc29d5d72fa46c48888ab5629b305f935b27c04" exitCode=0 Sep 30 17:45:02 crc kubenswrapper[4772]: I0930 17:45:02.131864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" event={"ID":"801a662c-6452-49ea-962b-2ebfbc394f8f","Type":"ContainerDied","Data":"edce61bdabf2cb239d68e624ddc29d5d72fa46c48888ab5629b305f935b27c04"} Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.458309 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.617627 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbts8\" (UniqueName: \"kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8\") pod \"801a662c-6452-49ea-962b-2ebfbc394f8f\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.617883 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume\") pod \"801a662c-6452-49ea-962b-2ebfbc394f8f\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.617944 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume\") pod \"801a662c-6452-49ea-962b-2ebfbc394f8f\" (UID: \"801a662c-6452-49ea-962b-2ebfbc394f8f\") " Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.618905 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume" (OuterVolumeSpecName: "config-volume") pod "801a662c-6452-49ea-962b-2ebfbc394f8f" (UID: "801a662c-6452-49ea-962b-2ebfbc394f8f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.647213 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "801a662c-6452-49ea-962b-2ebfbc394f8f" (UID: "801a662c-6452-49ea-962b-2ebfbc394f8f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.647491 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8" (OuterVolumeSpecName: "kube-api-access-pbts8") pod "801a662c-6452-49ea-962b-2ebfbc394f8f" (UID: "801a662c-6452-49ea-962b-2ebfbc394f8f"). InnerVolumeSpecName "kube-api-access-pbts8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.720644 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbts8\" (UniqueName: \"kubernetes.io/projected/801a662c-6452-49ea-962b-2ebfbc394f8f-kube-api-access-pbts8\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.720956 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/801a662c-6452-49ea-962b-2ebfbc394f8f-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:03 crc kubenswrapper[4772]: I0930 17:45:03.721038 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801a662c-6452-49ea-962b-2ebfbc394f8f-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.151864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" event={"ID":"801a662c-6452-49ea-962b-2ebfbc394f8f","Type":"ContainerDied","Data":"87bd3344f388ccf979ed29cb9190ed0f3537a28fcf44204d2e249d94fd22a596"} Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.151909 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87bd3344f388ccf979ed29cb9190ed0f3537a28fcf44204d2e249d94fd22a596" Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.151974 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8" Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.526655 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc"] Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.533168 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320860-hlnzc"] Sep 30 17:45:04 crc kubenswrapper[4772]: I0930 17:45:04.898484 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:45:04 crc kubenswrapper[4772]: E0930 17:45:04.899390 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:45:05 crc kubenswrapper[4772]: I0930 17:45:05.918644 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a68e96d-d547-4060-8ab8-c693324a4423" path="/var/lib/kubelet/pods/8a68e96d-d547-4060-8ab8-c693324a4423/volumes" Sep 30 17:45:18 crc kubenswrapper[4772]: I0930 17:45:18.898392 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:45:19 crc kubenswrapper[4772]: I0930 17:45:19.284327 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0"} Sep 30 17:45:35 crc kubenswrapper[4772]: I0930 17:45:35.423235 4772 
generic.go:334] "Generic (PLEG): container finished" podID="104de20c-fde6-42d5-aa8b-f23445a3661e" containerID="b437cc133fd54050e8628e7ac4699ce2ff61ab571ed422b68629638060154286" exitCode=0 Sep 30 17:45:35 crc kubenswrapper[4772]: I0930 17:45:35.423317 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" event={"ID":"104de20c-fde6-42d5-aa8b-f23445a3661e","Type":"ContainerDied","Data":"b437cc133fd54050e8628e7ac4699ce2ff61ab571ed422b68629638060154286"} Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.905838 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.961956 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph\") pod \"104de20c-fde6-42d5-aa8b-f23445a3661e\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.962485 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key\") pod \"104de20c-fde6-42d5-aa8b-f23445a3661e\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.962551 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7lc\" (UniqueName: \"kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc\") pod \"104de20c-fde6-42d5-aa8b-f23445a3661e\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.962607 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory\") pod \"104de20c-fde6-42d5-aa8b-f23445a3661e\" (UID: \"104de20c-fde6-42d5-aa8b-f23445a3661e\") " Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.974821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph" (OuterVolumeSpecName: "ceph") pod "104de20c-fde6-42d5-aa8b-f23445a3661e" (UID: "104de20c-fde6-42d5-aa8b-f23445a3661e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:36 crc kubenswrapper[4772]: I0930 17:45:36.985572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc" (OuterVolumeSpecName: "kube-api-access-9n7lc") pod "104de20c-fde6-42d5-aa8b-f23445a3661e" (UID: "104de20c-fde6-42d5-aa8b-f23445a3661e"). InnerVolumeSpecName "kube-api-access-9n7lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.002690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "104de20c-fde6-42d5-aa8b-f23445a3661e" (UID: "104de20c-fde6-42d5-aa8b-f23445a3661e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.005473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory" (OuterVolumeSpecName: "inventory") pod "104de20c-fde6-42d5-aa8b-f23445a3661e" (UID: "104de20c-fde6-42d5-aa8b-f23445a3661e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.076683 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.076716 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.076726 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7lc\" (UniqueName: \"kubernetes.io/projected/104de20c-fde6-42d5-aa8b-f23445a3661e-kube-api-access-9n7lc\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.076736 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/104de20c-fde6-42d5-aa8b-f23445a3661e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.444848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" event={"ID":"104de20c-fde6-42d5-aa8b-f23445a3661e","Type":"ContainerDied","Data":"4653904b0ca7d71de6de85f20ac8cb63d624af6415d159b424346debddc7d844"} Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.445190 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4653904b0ca7d71de6de85f20ac8cb63d624af6415d159b424346debddc7d844" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.444913 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.536237 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vlcvm"] Sep 30 17:45:37 crc kubenswrapper[4772]: E0930 17:45:37.536700 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104de20c-fde6-42d5-aa8b-f23445a3661e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.536725 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="104de20c-fde6-42d5-aa8b-f23445a3661e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:45:37 crc kubenswrapper[4772]: E0930 17:45:37.536795 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801a662c-6452-49ea-962b-2ebfbc394f8f" containerName="collect-profiles" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.536805 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="801a662c-6452-49ea-962b-2ebfbc394f8f" containerName="collect-profiles" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.537048 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="801a662c-6452-49ea-962b-2ebfbc394f8f" containerName="collect-profiles" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.537097 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="104de20c-fde6-42d5-aa8b-f23445a3661e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.537887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.541932 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.542090 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.542214 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.542265 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.543798 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.557386 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vlcvm"] Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.689343 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.689417 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.689444 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.689521 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5s2\" (UniqueName: \"kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.791045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.791118 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.791177 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5s2\" (UniqueName: \"kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.791288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.796953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.797263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.798478 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.809170 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5s2\" (UniqueName: \"kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2\") pod \"ssh-known-hosts-edpm-deployment-vlcvm\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:37 crc kubenswrapper[4772]: I0930 17:45:37.859144 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:38 crc kubenswrapper[4772]: I0930 17:45:38.512941 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vlcvm"] Sep 30 17:45:38 crc kubenswrapper[4772]: I0930 17:45:38.525877 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:45:39 crc kubenswrapper[4772]: I0930 17:45:39.460974 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" event={"ID":"525c2cee-edb7-4953-b0c9-6f08b4496be5","Type":"ContainerStarted","Data":"68d751b4dc9f443c448168afd11737dc83544a30767233736b629b3e40be15f3"} Sep 30 17:45:39 crc kubenswrapper[4772]: I0930 17:45:39.461451 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" event={"ID":"525c2cee-edb7-4953-b0c9-6f08b4496be5","Type":"ContainerStarted","Data":"bfc8da31d4044f2d03aca76ac9b5c40e62dea7094183eda487d014de253f32da"} Sep 30 17:45:39 crc kubenswrapper[4772]: I0930 17:45:39.480813 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" podStartSLOduration=1.973296531 podStartE2EDuration="2.480790273s" podCreationTimestamp="2025-09-30 17:45:37 +0000 UTC" firstStartedPulling="2025-09-30 17:45:38.52563403 +0000 UTC m=+2639.432646861" lastFinishedPulling="2025-09-30 17:45:39.033127762 +0000 UTC m=+2639.940140603" observedRunningTime="2025-09-30 17:45:39.473590734 +0000 UTC m=+2640.380603575" watchObservedRunningTime="2025-09-30 17:45:39.480790273 +0000 UTC m=+2640.387803104" Sep 30 17:45:44 crc kubenswrapper[4772]: I0930 17:45:44.900998 4772 scope.go:117] "RemoveContainer" containerID="a886012b36a37b10f41a89e3bba4b398fcbb3fdddfbce03fa2123cc602c5f8b3" Sep 30 17:45:49 crc kubenswrapper[4772]: I0930 17:45:49.545483 4772 generic.go:334] "Generic (PLEG): container finished" podID="525c2cee-edb7-4953-b0c9-6f08b4496be5" containerID="68d751b4dc9f443c448168afd11737dc83544a30767233736b629b3e40be15f3" exitCode=0 Sep 30 17:45:49 crc kubenswrapper[4772]: I0930 17:45:49.545596 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" event={"ID":"525c2cee-edb7-4953-b0c9-6f08b4496be5","Type":"ContainerDied","Data":"68d751b4dc9f443c448168afd11737dc83544a30767233736b629b3e40be15f3"} Sep 30 17:45:50 crc kubenswrapper[4772]: I0930 17:45:50.949713 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.035598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5s2\" (UniqueName: \"kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2\") pod \"525c2cee-edb7-4953-b0c9-6f08b4496be5\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.035729 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam\") pod \"525c2cee-edb7-4953-b0c9-6f08b4496be5\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.035902 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0\") pod \"525c2cee-edb7-4953-b0c9-6f08b4496be5\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.035958 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph\") pod \"525c2cee-edb7-4953-b0c9-6f08b4496be5\" (UID: \"525c2cee-edb7-4953-b0c9-6f08b4496be5\") " Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.045382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph" (OuterVolumeSpecName: "ceph") pod "525c2cee-edb7-4953-b0c9-6f08b4496be5" (UID: "525c2cee-edb7-4953-b0c9-6f08b4496be5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.046192 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2" (OuterVolumeSpecName: "kube-api-access-8w5s2") pod "525c2cee-edb7-4953-b0c9-6f08b4496be5" (UID: "525c2cee-edb7-4953-b0c9-6f08b4496be5"). InnerVolumeSpecName "kube-api-access-8w5s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.065443 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "525c2cee-edb7-4953-b0c9-6f08b4496be5" (UID: "525c2cee-edb7-4953-b0c9-6f08b4496be5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.067203 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "525c2cee-edb7-4953-b0c9-6f08b4496be5" (UID: "525c2cee-edb7-4953-b0c9-6f08b4496be5"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.138281 4772 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.138319 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.138332 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5s2\" (UniqueName: \"kubernetes.io/projected/525c2cee-edb7-4953-b0c9-6f08b4496be5-kube-api-access-8w5s2\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.138346 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525c2cee-edb7-4953-b0c9-6f08b4496be5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.562709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" event={"ID":"525c2cee-edb7-4953-b0c9-6f08b4496be5","Type":"ContainerDied","Data":"bfc8da31d4044f2d03aca76ac9b5c40e62dea7094183eda487d014de253f32da"} Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.562752 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc8da31d4044f2d03aca76ac9b5c40e62dea7094183eda487d014de253f32da" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.563118 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vlcvm" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.636664 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j"] Sep 30 17:45:51 crc kubenswrapper[4772]: E0930 17:45:51.637228 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525c2cee-edb7-4953-b0c9-6f08b4496be5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.637251 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="525c2cee-edb7-4953-b0c9-6f08b4496be5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.637511 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="525c2cee-edb7-4953-b0c9-6f08b4496be5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.638356 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.640556 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.640802 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.641145 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.641378 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.641556 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.646777 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j"] Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.698368 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.698510 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.698540 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwgb\" (UniqueName: \"kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.698576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.801605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.802274 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key\") 
pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.802308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwgb\" (UniqueName: \"kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.802379 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.809219 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.809608 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.809729 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:51 crc kubenswrapper[4772]: I0930 17:45:51.822663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwgb\" (UniqueName: \"kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vpf6j\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:52 crc kubenswrapper[4772]: I0930 17:45:52.007251 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:45:52 crc kubenswrapper[4772]: I0930 17:45:52.535855 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j"] Sep 30 17:45:52 crc kubenswrapper[4772]: W0930 17:45:52.538152 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d5bcedc_1eef_4301_ac7f_af49c51fc9f3.slice/crio-77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5 WatchSource:0}: Error finding container 77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5: Status 404 returned error can't find the container with id 77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5 Sep 30 17:45:52 crc kubenswrapper[4772]: I0930 17:45:52.572532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" event={"ID":"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3","Type":"ContainerStarted","Data":"77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5"} Sep 30 17:45:53 crc kubenswrapper[4772]: I0930 17:45:53.582798 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" event={"ID":"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3","Type":"ContainerStarted","Data":"e139dcf4dc7b5aaafc46eeea288699a626500cf1082bcb2d67e388e425eafef0"} Sep 30 17:45:53 crc kubenswrapper[4772]: I0930 17:45:53.599194 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" podStartSLOduration=2.05058898 podStartE2EDuration="2.599177361s" podCreationTimestamp="2025-09-30 17:45:51 +0000 UTC" firstStartedPulling="2025-09-30 17:45:52.540589823 +0000 UTC m=+2653.447602654" lastFinishedPulling="2025-09-30 17:45:53.089178204 +0000 UTC m=+2653.996191035" observedRunningTime="2025-09-30 17:45:53.596218854 +0000 UTC m=+2654.503231685" watchObservedRunningTime="2025-09-30 17:45:53.599177361 +0000 UTC m=+2654.506190192" Sep 30 17:46:01 crc kubenswrapper[4772]: I0930 17:46:01.669199 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" containerID="e139dcf4dc7b5aaafc46eeea288699a626500cf1082bcb2d67e388e425eafef0" exitCode=0 Sep 30 17:46:01 crc kubenswrapper[4772]: I0930 17:46:01.669300 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" event={"ID":"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3","Type":"ContainerDied","Data":"e139dcf4dc7b5aaafc46eeea288699a626500cf1082bcb2d67e388e425eafef0"} Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.077460 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.247815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory\") pod \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.248166 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key\") pod \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.248224 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph\") pod \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.248555 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrwgb\" (UniqueName: \"kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb\") pod \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\" (UID: \"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3\") " Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.253322 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb" (OuterVolumeSpecName: "kube-api-access-nrwgb") pod "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" (UID: "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3"). InnerVolumeSpecName "kube-api-access-nrwgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.253788 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph" (OuterVolumeSpecName: "ceph") pod "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" (UID: "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.274711 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" (UID: "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.279686 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory" (OuterVolumeSpecName: "inventory") pod "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" (UID: "2d5bcedc-1eef-4301-ac7f-af49c51fc9f3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.350670 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.350701 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.350709 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.350719 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrwgb\" (UniqueName: \"kubernetes.io/projected/2d5bcedc-1eef-4301-ac7f-af49c51fc9f3-kube-api-access-nrwgb\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.693457 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" event={"ID":"2d5bcedc-1eef-4301-ac7f-af49c51fc9f3","Type":"ContainerDied","Data":"77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5"} Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.693537 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77fc1760c304edc5108a4c6056c78c7ee735afb62fabd3699f1e1cc3dc0f93e5" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.693556 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vpf6j" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.771818 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk"] Sep 30 17:46:03 crc kubenswrapper[4772]: E0930 17:46:03.772371 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.772394 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.772627 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5bcedc-1eef-4301-ac7f-af49c51fc9f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.773486 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.776563 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.776578 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.776972 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.777207 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.777604 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.787837 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk"] Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.963389 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.963689 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pcl\" (UniqueName: \"kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.963730 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:03 crc kubenswrapper[4772]: I0930 17:46:03.963797 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.065359 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pcl\" (UniqueName: \"kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.065406 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.065427 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.065475 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.069349 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.069339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.079643 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.082772 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pcl\" (UniqueName: \"kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.097754 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.596804 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk"] Sep 30 17:46:04 crc kubenswrapper[4772]: W0930 17:46:04.604877 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267b3439_a782_4c26_b376_19d72ece7ea1.slice/crio-98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b WatchSource:0}: Error finding container 98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b: Status 404 returned error can't find the container with id 98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b Sep 30 17:46:04 crc kubenswrapper[4772]: I0930 17:46:04.704111 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" event={"ID":"267b3439-a782-4c26-b376-19d72ece7ea1","Type":"ContainerStarted","Data":"98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b"} Sep 30 17:46:05 crc kubenswrapper[4772]: I0930 17:46:05.716771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" event={"ID":"267b3439-a782-4c26-b376-19d72ece7ea1","Type":"ContainerStarted","Data":"fcaa414fc0727578ebd549da87294cb1b3dfe0fc61a3a9a47a7a26c95b3246a3"} Sep 30 17:46:05 crc kubenswrapper[4772]: I0930 17:46:05.756694 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" podStartSLOduration=2.310726391 podStartE2EDuration="2.756651117s" podCreationTimestamp="2025-09-30 17:46:03 +0000 UTC" firstStartedPulling="2025-09-30 17:46:04.607654634 +0000 UTC m=+2665.514667465" lastFinishedPulling="2025-09-30 17:46:05.05357936 +0000 UTC m=+2665.960592191" observedRunningTime="2025-09-30 17:46:05.741372956 +0000 UTC m=+2666.648385827" watchObservedRunningTime="2025-09-30 17:46:05.756651117 +0000 UTC m=+2666.663663978" Sep 30 17:46:15 crc kubenswrapper[4772]: I0930 17:46:15.800279 4772 generic.go:334] "Generic (PLEG): container finished" podID="267b3439-a782-4c26-b376-19d72ece7ea1" containerID="fcaa414fc0727578ebd549da87294cb1b3dfe0fc61a3a9a47a7a26c95b3246a3" exitCode=0 Sep 30 17:46:15 crc kubenswrapper[4772]: I0930 17:46:15.800433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" event={"ID":"267b3439-a782-4c26-b376-19d72ece7ea1","Type":"ContainerDied","Data":"fcaa414fc0727578ebd549da87294cb1b3dfe0fc61a3a9a47a7a26c95b3246a3"} Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.221432 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.420169 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory\") pod \"267b3439-a782-4c26-b376-19d72ece7ea1\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.420591 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key\") pod \"267b3439-a782-4c26-b376-19d72ece7ea1\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.420646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph\") pod \"267b3439-a782-4c26-b376-19d72ece7ea1\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.420846 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4pcl\" (UniqueName: \"kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl\") pod \"267b3439-a782-4c26-b376-19d72ece7ea1\" (UID: \"267b3439-a782-4c26-b376-19d72ece7ea1\") " Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.425937 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph" (OuterVolumeSpecName: "ceph") pod "267b3439-a782-4c26-b376-19d72ece7ea1" (UID: "267b3439-a782-4c26-b376-19d72ece7ea1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.426421 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl" (OuterVolumeSpecName: "kube-api-access-j4pcl") pod "267b3439-a782-4c26-b376-19d72ece7ea1" (UID: "267b3439-a782-4c26-b376-19d72ece7ea1"). InnerVolumeSpecName "kube-api-access-j4pcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.457235 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "267b3439-a782-4c26-b376-19d72ece7ea1" (UID: "267b3439-a782-4c26-b376-19d72ece7ea1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.457807 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory" (OuterVolumeSpecName: "inventory") pod "267b3439-a782-4c26-b376-19d72ece7ea1" (UID: "267b3439-a782-4c26-b376-19d72ece7ea1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.523612 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4pcl\" (UniqueName: \"kubernetes.io/projected/267b3439-a782-4c26-b376-19d72ece7ea1-kube-api-access-j4pcl\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.523652 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.523662 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.523672 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/267b3439-a782-4c26-b376-19d72ece7ea1-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.819051 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" event={"ID":"267b3439-a782-4c26-b376-19d72ece7ea1","Type":"ContainerDied","Data":"98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b"} Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.819109 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b97cf6cf0d7907e42e80689522ded9544b74483b090c292bfcbd27675c663b" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.819124 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.894068 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj"] Sep 30 17:46:17 crc kubenswrapper[4772]: E0930 17:46:17.894494 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267b3439-a782-4c26-b376-19d72ece7ea1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.894517 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="267b3439-a782-4c26-b376-19d72ece7ea1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.894702 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="267b3439-a782-4c26-b376-19d72ece7ea1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.895357 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.899713 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904005 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904120 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904155 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904164 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904233 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.904469 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.911341 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.928745 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 17:46:17 crc kubenswrapper[4772]: I0930 17:46:17.928765 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj"] Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038348 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038425 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038513 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038597 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038641 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038701 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038734 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhc22\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038839 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038866 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038941 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.038976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.039217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141477 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141554 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhc22\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141716 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141789 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.141995 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.142238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.142273 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.145922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.145922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.148509 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.148602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.148736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.148788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.149432 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.149642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.150882 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.151290 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.151729 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: 
\"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.158315 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.161152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.161552 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.163341 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhc22\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t42nj\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.236221 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.782300 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj"] Sep 30 17:46:18 crc kubenswrapper[4772]: I0930 17:46:18.827692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" event={"ID":"32420052-34e9-4cca-a4ee-239d3416cd9a","Type":"ContainerStarted","Data":"92306b4bced5116ec0507f4c17e10a7113b9efb602aabd234225a6d64f34a9b3"} Sep 30 17:46:19 crc kubenswrapper[4772]: I0930 17:46:19.840715 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" event={"ID":"32420052-34e9-4cca-a4ee-239d3416cd9a","Type":"ContainerStarted","Data":"7223d489f438acefe2584a261c8c8e9d25a400c63093167ca1367f4c90c3d70e"} Sep 30 17:46:19 crc kubenswrapper[4772]: I0930 17:46:19.877409 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" podStartSLOduration=2.406203789 podStartE2EDuration="2.877383798s" podCreationTimestamp="2025-09-30 17:46:17 +0000 UTC" firstStartedPulling="2025-09-30 17:46:18.792398996 +0000 UTC m=+2679.699411827" lastFinishedPulling="2025-09-30 17:46:19.263579005 +0000 UTC m=+2680.170591836" observedRunningTime="2025-09-30 17:46:19.862370364 +0000 UTC m=+2680.769383195" watchObservedRunningTime="2025-09-30 17:46:19.877383798 +0000 UTC m=+2680.784396639" Sep 30 17:47:04 crc kubenswrapper[4772]: I0930 17:47:04.196291 4772 generic.go:334] "Generic (PLEG): container finished" podID="32420052-34e9-4cca-a4ee-239d3416cd9a" containerID="7223d489f438acefe2584a261c8c8e9d25a400c63093167ca1367f4c90c3d70e" exitCode=0 Sep 30 17:47:04 crc kubenswrapper[4772]: I0930 17:47:04.196404 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" event={"ID":"32420052-34e9-4cca-a4ee-239d3416cd9a","Type":"ContainerDied","Data":"7223d489f438acefe2584a261c8c8e9d25a400c63093167ca1367f4c90c3d70e"} Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.592290 4772 util.go:48] "No ready sandbox for pod can be found. 
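The `pod_startup_latency_tracker` entry above encodes a relationship its own numbers bear out: `podStartE2EDuration` equals `watchObservedRunningTime - podCreationTimestamp` (17:46:19.877383798 - 17:46:17 = 2.877383798s), and `podStartSLOduration` is that interval minus the image-pull window `lastFinishedPulling - firstStartedPulling` (2.877383798 - 0.471180009 = 2.406203789s). A minimal Python check of the arithmetic, using the timestamps exactly as they appear in the entry (nanoseconds are truncated to microseconds, so the results agree to about a microsecond):

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f %z"

def parse(ts: str) -> datetime:
    # Tracker timestamps look like "2025-09-30 17:46:18.792398996 +0000 UTC m=+...";
    # %f accepts at most six digits, so truncate nanoseconds and drop the suffix.
    date, clock, off = ts.split()[:3]
    head, _, frac = clock.partition(".")
    clock = f"{head}.{(frac or '0')[:6]:0<6}"
    return datetime.strptime(f"{date} {clock} {off}", FMT)

created  = parse("2025-09-30 17:46:17 +0000 UTC")
watched  = parse("2025-09-30 17:46:19.877383798 +0000 UTC")
pull_beg = parse("2025-09-30 17:46:18.792398996 +0000 UTC")
pull_end = parse("2025-09-30 17:46:19.263579005 +0000 UTC")

e2e = (watched - created).total_seconds()
slo = e2e - (pull_end - pull_beg).total_seconds()
# Prints ~2.877383s and ~2.406202s, matching the logged
# 2.877383798s / 2.406203789s up to the truncated nanoseconds.
print(f"E2E = {e2e:.6f}s, SLO = {slo:.6f}s")
```

The same identity holds for the later tracker lines, e.g. the ceph-client job: 2.255843s end-to-end minus a 0.417125s pull window gives the reported 1.838718s SLO duration.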
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740499 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740572 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740590 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740607 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc 
kubenswrapper[4772]: I0930 17:47:05.740803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740872 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740898 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.740968 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.741034 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.741064 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhc22\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22\") pod \"32420052-34e9-4cca-a4ee-239d3416cd9a\" (UID: \"32420052-34e9-4cca-a4ee-239d3416cd9a\") " Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.748026 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.748428 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22" (OuterVolumeSpecName: "kube-api-access-qhc22") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "kube-api-access-qhc22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.748760 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.749945 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.750815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph" (OuterVolumeSpecName: "ceph") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.750989 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.752242 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.752715 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.753765 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.753808 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.755522 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.767130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.767330 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.781214 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory" (OuterVolumeSpecName: "inventory") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.787183 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32420052-34e9-4cca-a4ee-239d3416cd9a" (UID: "32420052-34e9-4cca-a4ee-239d3416cd9a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.842987 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843020 4772 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843031 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843044 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843056 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843066 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843075 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843096 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhc22\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-kube-api-access-qhc22\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843104 4772 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843112 4772 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843121 4772 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843130 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc 
kubenswrapper[4772]: I0930 17:47:05.843138 4772 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843148 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/32420052-34e9-4cca-a4ee-239d3416cd9a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:05 crc kubenswrapper[4772]: I0930 17:47:05.843159 4772 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32420052-34e9-4cca-a4ee-239d3416cd9a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.216404 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" event={"ID":"32420052-34e9-4cca-a4ee-239d3416cd9a","Type":"ContainerDied","Data":"92306b4bced5116ec0507f4c17e10a7113b9efb602aabd234225a6d64f34a9b3"} Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.216449 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t42nj" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.216459 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92306b4bced5116ec0507f4c17e10a7113b9efb602aabd234225a6d64f34a9b3" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.314077 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc"] Sep 30 17:47:06 crc kubenswrapper[4772]: E0930 17:47:06.314501 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32420052-34e9-4cca-a4ee-239d3416cd9a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.314529 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="32420052-34e9-4cca-a4ee-239d3416cd9a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.314746 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="32420052-34e9-4cca-a4ee-239d3416cd9a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.315464 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.326938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc"] Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.327764 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.328186 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.329646 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.329857 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.331357 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.353883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.353927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.354216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22jls\" (UniqueName: \"kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.354349 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.455435 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.455478 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.455591 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22jls\" (UniqueName: \"kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.455635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.459747 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.459826 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.459967 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.474309 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22jls\" (UniqueName: \"kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:06 crc kubenswrapper[4772]: I0930 17:47:06.632231 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:07 crc kubenswrapper[4772]: I0930 17:47:07.203713 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc"] Sep 30 17:47:07 crc kubenswrapper[4772]: I0930 17:47:07.225961 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" event={"ID":"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4","Type":"ContainerStarted","Data":"934fa9c13aefac869e42a6e8a3d5a8a431005e9ecac6b6f660464dc9f54d3ede"} Sep 30 17:47:08 crc kubenswrapper[4772]: I0930 17:47:08.235748 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" event={"ID":"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4","Type":"ContainerStarted","Data":"48865e45edd2c9212ab186b3cf6b703f5b36678d46fad894090f714084c54422"} Sep 30 17:47:08 crc kubenswrapper[4772]: I0930 17:47:08.255861 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" podStartSLOduration=1.838717777 podStartE2EDuration="2.255843023s" podCreationTimestamp="2025-09-30 17:47:06 +0000 UTC" firstStartedPulling="2025-09-30 17:47:07.203608632 +0000 UTC m=+2728.110621463" lastFinishedPulling="2025-09-30 17:47:07.620733878 +0000 UTC m=+2728.527746709" observedRunningTime="2025-09-30 17:47:08.249921685 +0000 UTC m=+2729.156934516" watchObservedRunningTime="2025-09-30 17:47:08.255843023 +0000 UTC m=+2729.162855854" Sep 30 17:47:14 crc kubenswrapper[4772]: I0930 17:47:14.324404 4772 generic.go:334] "Generic (PLEG): container finished" podID="bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" containerID="48865e45edd2c9212ab186b3cf6b703f5b36678d46fad894090f714084c54422" exitCode=0 Sep 30 17:47:14 crc kubenswrapper[4772]: I0930 17:47:14.324580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" event={"ID":"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4","Type":"ContainerDied","Data":"48865e45edd2c9212ab186b3cf6b703f5b36678d46fad894090f714084c54422"} Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.796469 4772 util.go:48] "No ready sandbox for pod can be found. 
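Each `SyncLoop (PLEG): event for pod` entry carries a compact payload: pod UID, event type, and container ID. In this journal the first `ContainerStarted` per pod is the sandbox (`934fa9c1...` for the ceph-client job) and the second is the workload container (`48865e45...`), whose `ContainerDied` with `exitCode=0` marks the job complete; a later `ContainerDied` for the sandbox ID together with `Container not found in pod's containers` is the sandbox itself being torn down. A per-pod timeline can be recovered in one pass (illustrative sketch; `pleg_timeline` is a hypothetical name):

```python
import re
from collections import defaultdict

# Payload shape from the "SyncLoop (PLEG): event for pod" lines above.
EVENT = re.compile(
    r'event=\{"ID":"(?P<uid>[0-9a-f-]+)","Type":"(?P<typ>\w+)","Data":"(?P<cid>[0-9a-f]+)"\}'
)

def pleg_timeline(journal_text: str) -> dict[str, list[tuple[str, str]]]:
    """Per-pod-UID ordered (event type, short container ID) pairs."""
    timeline: dict[str, list[tuple[str, str]]] = defaultdict(list)
    for m in EVENT.finditer(journal_text):
        timeline[m.group("uid")].append((m.group("typ"), m.group("cid")[:12]))
    return dict(timeline)
```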
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.956930 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory\") pod \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.956982 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph\") pod \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.957021 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22jls\" (UniqueName: \"kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls\") pod \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.957733 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key\") pod \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\" (UID: \"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4\") " Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.962082 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph" (OuterVolumeSpecName: "ceph") pod "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" (UID: "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.963819 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls" (OuterVolumeSpecName: "kube-api-access-22jls") pod "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" (UID: "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4"). InnerVolumeSpecName "kube-api-access-22jls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.984618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory" (OuterVolumeSpecName: "inventory") pod "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" (UID: "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4772]: I0930 17:47:15.990894 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" (UID: "bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.059739 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.059777 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.059788 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22jls\" (UniqueName: \"kubernetes.io/projected/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-kube-api-access-22jls\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.059797 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.354632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" event={"ID":"bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4","Type":"ContainerDied","Data":"934fa9c13aefac869e42a6e8a3d5a8a431005e9ecac6b6f660464dc9f54d3ede"} Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.354873 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934fa9c13aefac869e42a6e8a3d5a8a431005e9ecac6b6f660464dc9f54d3ede" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.354680 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.462882 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r"] Sep 30 17:47:16 crc kubenswrapper[4772]: E0930 17:47:16.463717 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.463838 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.464201 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.465154 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.470509 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.470539 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.470759 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.470864 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.470894 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.471054 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.475718 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r"] Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.668576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.669203 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.669309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.669501 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.669604 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 
17:47:16.670730 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8ss\" (UniqueName: \"kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772445 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8ss\" (UniqueName: \"kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772504 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772558 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.772604 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.773645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.778261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.778468 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.778663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.783008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.791012 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8ss\" (UniqueName: \"kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-llw7r\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:16 crc kubenswrapper[4772]: I0930 17:47:16.799091 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:47:17 crc kubenswrapper[4772]: I0930 17:47:17.318185 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r"] Sep 30 17:47:17 crc kubenswrapper[4772]: I0930 17:47:17.367400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" event={"ID":"4e366f6f-7ee6-42c4-8a83-7cba085e2a46","Type":"ContainerStarted","Data":"490e5e07e7ed157e20fddc41b06639ba8cfaf18d6bbd8038952463e176e5f35e"} Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.366452 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.369503 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.377771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" event={"ID":"4e366f6f-7ee6-42c4-8a83-7cba085e2a46","Type":"ContainerStarted","Data":"73f60ff06e991cf1da69530e93c451cc091e1dcf976125ea308d3a5e85a54be1"} Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.379206 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.413792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.413889 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.414030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-779cg\" (UniqueName: \"kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.425043 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" podStartSLOduration=1.859718392 podStartE2EDuration="2.425019467s" podCreationTimestamp="2025-09-30 17:47:16 +0000 UTC" firstStartedPulling="2025-09-30 17:47:17.327336555 +0000 UTC m=+2738.234349386" lastFinishedPulling="2025-09-30 17:47:17.89263763 +0000 UTC m=+2738.799650461" observedRunningTime="2025-09-30 17:47:18.414892377 +0000 UTC m=+2739.321905208" watchObservedRunningTime="2025-09-30 17:47:18.425019467 +0000 UTC m=+2739.332032308" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.516270 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-779cg\" (UniqueName: \"kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.516686 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.516883 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities\") pod 
\"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.517526 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.517537 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.533305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-779cg\" (UniqueName: \"kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg\") pod \"community-operators-s226p\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:18 crc kubenswrapper[4772]: I0930 17:47:18.690360 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:19 crc kubenswrapper[4772]: I0930 17:47:19.254476 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:19 crc kubenswrapper[4772]: W0930 17:47:19.262875 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb43fb4_c218_4ba0_a381_1484bc16637a.slice/crio-d6a8c8c1baf26092b2af60f58ca51484afa4fbc55faee9ab6449f223e445e619 WatchSource:0}: Error finding container d6a8c8c1baf26092b2af60f58ca51484afa4fbc55faee9ab6449f223e445e619: Status 404 returned error can't find the container with id d6a8c8c1baf26092b2af60f58ca51484afa4fbc55faee9ab6449f223e445e619 Sep 30 17:47:19 crc kubenswrapper[4772]: I0930 17:47:19.407749 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerStarted","Data":"d6a8c8c1baf26092b2af60f58ca51484afa4fbc55faee9ab6449f223e445e619"} Sep 30 17:47:20 crc kubenswrapper[4772]: I0930 17:47:20.418349 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerDied","Data":"dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696"} Sep 30 17:47:20 crc kubenswrapper[4772]: I0930 17:47:20.422267 4772 generic.go:334] "Generic (PLEG): container finished" podID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerID="dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696" exitCode=0 Sep 30 17:47:21 crc kubenswrapper[4772]: I0930 17:47:21.460472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerStarted","Data":"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb"} Sep 30 17:47:22 crc kubenswrapper[4772]: I0930 17:47:22.471677 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerID="8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb" exitCode=0 Sep 30 17:47:22 crc kubenswrapper[4772]: I0930 17:47:22.471716 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerDied","Data":"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb"} Sep 30 17:47:23 crc kubenswrapper[4772]: I0930 17:47:23.483146 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerStarted","Data":"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0"} Sep 30 17:47:23 crc kubenswrapper[4772]: I0930 17:47:23.504613 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s226p" podStartSLOduration=3.00575181 podStartE2EDuration="5.50459001s" podCreationTimestamp="2025-09-30 17:47:18 +0000 UTC" firstStartedPulling="2025-09-30 17:47:20.420573436 +0000 UTC m=+2741.327586267" lastFinishedPulling="2025-09-30 17:47:22.919411636 +0000 UTC m=+2743.826424467" observedRunningTime="2025-09-30 17:47:23.497572473 +0000 UTC m=+2744.404585314" watchObservedRunningTime="2025-09-30 17:47:23.50459001 +0000 UTC m=+2744.411602841" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.341665 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.343814 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.353142 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.360828 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n7ff\" (UniqueName: \"kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.361123 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.361180 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.462617 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n7ff\" (UniqueName: \"kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " 
pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.462748 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.462783 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.463324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.463440 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.493001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n7ff\" (UniqueName: \"kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff\") pod \"redhat-marketplace-zcpxd\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:25 crc kubenswrapper[4772]: I0930 17:47:25.675128 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:26 crc kubenswrapper[4772]: I0930 17:47:26.180777 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:26 crc kubenswrapper[4772]: W0930 17:47:26.188359 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac636cfe_2c67_4c9b_9977_fcd53f0ef67d.slice/crio-ff1077d1ac6ffedccdda1e1dfd2a83484b77846bced07ea4e815b1956d2ecb9f WatchSource:0}: Error finding container ff1077d1ac6ffedccdda1e1dfd2a83484b77846bced07ea4e815b1956d2ecb9f: Status 404 returned error can't find the container with id ff1077d1ac6ffedccdda1e1dfd2a83484b77846bced07ea4e815b1956d2ecb9f Sep 30 17:47:26 crc kubenswrapper[4772]: I0930 17:47:26.522618 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerID="88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107" exitCode=0 Sep 30 17:47:26 crc kubenswrapper[4772]: I0930 17:47:26.522947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerDied","Data":"88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107"} Sep 30 17:47:26 crc kubenswrapper[4772]: I0930 17:47:26.522992 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerStarted","Data":"ff1077d1ac6ffedccdda1e1dfd2a83484b77846bced07ea4e815b1956d2ecb9f"} Sep 30 17:47:28 crc kubenswrapper[4772]: I0930 17:47:28.541994 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerID="ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0" exitCode=0 Sep 30 17:47:28 crc kubenswrapper[4772]: I0930 17:47:28.542111 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerDied","Data":"ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0"} Sep 30 17:47:28 crc kubenswrapper[4772]: I0930 17:47:28.691189 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:28 crc kubenswrapper[4772]: I0930 17:47:28.691233 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:28 crc kubenswrapper[4772]: I0930 17:47:28.755676 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:29 crc kubenswrapper[4772]: I0930 17:47:29.554222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerStarted","Data":"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f"} Sep 30 17:47:29 crc kubenswrapper[4772]: I0930 17:47:29.571982 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zcpxd" podStartSLOduration=2.157771783 podStartE2EDuration="4.571965638s" podCreationTimestamp="2025-09-30 17:47:25 +0000 UTC" firstStartedPulling="2025-09-30 17:47:26.52577076 +0000 UTC m=+2747.432783611" 
lastFinishedPulling="2025-09-30 17:47:28.939964635 +0000 UTC m=+2749.846977466" observedRunningTime="2025-09-30 17:47:29.570472838 +0000 UTC m=+2750.477485679" watchObservedRunningTime="2025-09-30 17:47:29.571965638 +0000 UTC m=+2750.478978469" Sep 30 17:47:29 crc kubenswrapper[4772]: I0930 17:47:29.603976 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:30 crc kubenswrapper[4772]: I0930 17:47:30.132702 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:31 crc kubenswrapper[4772]: I0930 17:47:31.569410 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s226p" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="registry-server" containerID="cri-o://024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0" gracePeriod=2 Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.049953 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.087519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-779cg\" (UniqueName: \"kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg\") pod \"3eb43fb4-c218-4ba0-a381-1484bc16637a\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.087619 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content\") pod \"3eb43fb4-c218-4ba0-a381-1484bc16637a\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.087686 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities\") pod \"3eb43fb4-c218-4ba0-a381-1484bc16637a\" (UID: \"3eb43fb4-c218-4ba0-a381-1484bc16637a\") " Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.088514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities" (OuterVolumeSpecName: "utilities") pod "3eb43fb4-c218-4ba0-a381-1484bc16637a" (UID: "3eb43fb4-c218-4ba0-a381-1484bc16637a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.088720 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.099459 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg" (OuterVolumeSpecName: "kube-api-access-779cg") pod "3eb43fb4-c218-4ba0-a381-1484bc16637a" (UID: "3eb43fb4-c218-4ba0-a381-1484bc16637a"). InnerVolumeSpecName "kube-api-access-779cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.190536 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-779cg\" (UniqueName: \"kubernetes.io/projected/3eb43fb4-c218-4ba0-a381-1484bc16637a-kube-api-access-779cg\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.587836 4772 generic.go:334] "Generic (PLEG): container finished" podID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerID="024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0" exitCode=0 Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.587891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerDied","Data":"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0"} Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.587920 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s226p" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.587974 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s226p" event={"ID":"3eb43fb4-c218-4ba0-a381-1484bc16637a","Type":"ContainerDied","Data":"d6a8c8c1baf26092b2af60f58ca51484afa4fbc55faee9ab6449f223e445e619"} Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.587997 4772 scope.go:117] "RemoveContainer" containerID="024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.619960 4772 scope.go:117] "RemoveContainer" containerID="8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.625261 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3eb43fb4-c218-4ba0-a381-1484bc16637a" (UID: "3eb43fb4-c218-4ba0-a381-1484bc16637a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.641821 4772 scope.go:117] "RemoveContainer" containerID="dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.690493 4772 scope.go:117] "RemoveContainer" containerID="024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0" Sep 30 17:47:32 crc kubenswrapper[4772]: E0930 17:47:32.690798 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0\": container with ID starting with 024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0 not found: ID does not exist" containerID="024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.690828 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0"} err="failed to get container status \"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0\": rpc error: code = NotFound desc = could not find container \"024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0\": container with ID starting with 024c693cf727b050883ff51fc0f86983c0d0d279408ecf298a671845f662e4f0 not found: ID does not exist" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.690847 4772 scope.go:117] "RemoveContainer" containerID="8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb" Sep 30 17:47:32 crc kubenswrapper[4772]: E0930 17:47:32.691380 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb\": container with ID starting with 8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb not found: ID does not exist" containerID="8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.691417 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb"} err="failed to get container status \"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb\": rpc error: code = NotFound desc = could not find container \"8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb\": container with ID starting with 8b785294644186d9edc61a3551b87ca084ae6992f9a47e3e79daca34184b4bcb not found: ID does not exist" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.691444 4772 scope.go:117] "RemoveContainer" containerID="dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696" Sep 30 17:47:32 crc kubenswrapper[4772]: E0930 17:47:32.691717 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696\": container with ID starting with dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696 not found: ID does not exist" containerID="dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.691745 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696"} err="failed to get container status \"dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696\": rpc error: code = NotFound desc = could not find container \"dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696\": container with ID starting with dd756ad09e6eb15bfbd95d902b04bcf753b189ed13428db8e38669a3f36f1696 not found: ID does not exist" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.702816 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb43fb4-c218-4ba0-a381-1484bc16637a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.930966 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:32 crc kubenswrapper[4772]: I0930 17:47:32.938196 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s226p"] Sep 30 17:47:33 crc kubenswrapper[4772]: I0930 17:47:33.913156 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" path="/var/lib/kubelet/pods/3eb43fb4-c218-4ba0-a381-1484bc16637a/volumes" Sep 30 17:47:35 crc kubenswrapper[4772]: I0930 17:47:35.675490 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:35 crc kubenswrapper[4772]: I0930 17:47:35.676186 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:35 crc kubenswrapper[4772]: I0930 17:47:35.749581 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:36 crc kubenswrapper[4772]: I0930 17:47:36.666213 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:36 crc kubenswrapper[4772]: I0930 17:47:36.716538 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:38 crc kubenswrapper[4772]: I0930 17:47:38.642437 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zcpxd" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="registry-server" containerID="cri-o://9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f" gracePeriod=2 Sep 30 17:47:38 crc kubenswrapper[4772]: I0930 17:47:38.655215 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:47:38 crc kubenswrapper[4772]: I0930 17:47:38.655264 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.069118 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.227748 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content\") pod \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.227811 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities\") pod \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.227848 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n7ff\" (UniqueName: \"kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff\") pod \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\" (UID: \"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d\") " Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.228745 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities" (OuterVolumeSpecName: "utilities") pod "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" (UID: "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.234261 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff" (OuterVolumeSpecName: "kube-api-access-5n7ff") pod "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" (UID: "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d"). InnerVolumeSpecName "kube-api-access-5n7ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.242961 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" (UID: "ac636cfe-2c67-4c9b-9977-fcd53f0ef67d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.330615 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.330693 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.330703 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n7ff\" (UniqueName: \"kubernetes.io/projected/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d-kube-api-access-5n7ff\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.653112 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerID="9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f" exitCode=0 Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.653170 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerDied","Data":"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f"} Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.653205 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcpxd" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.653234 4772 scope.go:117] "RemoveContainer" containerID="9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.653217 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcpxd" event={"ID":"ac636cfe-2c67-4c9b-9977-fcd53f0ef67d","Type":"ContainerDied","Data":"ff1077d1ac6ffedccdda1e1dfd2a83484b77846bced07ea4e815b1956d2ecb9f"} Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.680044 4772 scope.go:117] "RemoveContainer" containerID="ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.687481 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.694582 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcpxd"] Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.706900 4772 scope.go:117] "RemoveContainer" containerID="88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.805244 4772 scope.go:117] "RemoveContainer" containerID="9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f" Sep 30 17:47:39 crc kubenswrapper[4772]: E0930 17:47:39.807547 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f\": container with ID starting with 9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f not found: ID does not exist" containerID="9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.808091 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f"} err="failed to get container status \"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f\": rpc error: code = NotFound desc = could not find container \"9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f\": container with ID starting with 9b98bd33e772d98f9310157e7add4b3ba5bb004159eb837e6c74d9b28e97f40f not found: ID does not exist" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.808132 4772 scope.go:117] "RemoveContainer" containerID="ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0" Sep 30 17:47:39 crc kubenswrapper[4772]: E0930 17:47:39.808519 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0\": container with ID starting with ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0 not found: ID does not exist" containerID="ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.808583 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0"} err="failed to get container status \"ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0\": rpc error: code = NotFound desc = could not find container \"ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0\": container with ID starting with ab6f6bd38f84613d194421a3dc8cbe0b155562698906804437ef81a36038acf0 not found: ID does not exist" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.808612 4772 scope.go:117] "RemoveContainer" containerID="88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107" Sep 30 17:47:39 crc kubenswrapper[4772]: E0930 17:47:39.808926 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107\": container with ID starting with 88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107 not found: ID does not exist" containerID="88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.808963 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107"} err="failed to get container status \"88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107\": rpc error: code = NotFound desc = could not find container \"88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107\": container with ID starting with 88e0fecde10e42c780bdc2507508b9c546c4a0a485d2ddda895abc3954519107 not found: ID does not exist" Sep 30 17:47:39 crc kubenswrapper[4772]: I0930 17:47:39.909221 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" path="/var/lib/kubelet/pods/ac636cfe-2c67-4c9b-9977-fcd53f0ef67d/volumes" Sep 30 17:48:08 crc kubenswrapper[4772]: I0930 17:48:08.654977 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:48:08 crc kubenswrapper[4772]: I0930 17:48:08.655788 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:48:38 crc kubenswrapper[4772]: I0930 17:48:38.655777 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:48:38 crc kubenswrapper[4772]: I0930 17:48:38.656357 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:48:38 crc kubenswrapper[4772]: I0930 17:48:38.656405 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 17:48:38 crc kubenswrapper[4772]: I0930 17:48:38.657348 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:48:38 crc kubenswrapper[4772]: I0930 17:48:38.657403 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0" gracePeriod=600 Sep 30 17:48:39 crc kubenswrapper[4772]: I0930 17:48:39.173929 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0" exitCode=0 Sep 30 17:48:39 crc kubenswrapper[4772]: I0930 17:48:39.174021 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0"} Sep 30 17:48:39 crc kubenswrapper[4772]: I0930 17:48:39.174337 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"} Sep 30 17:48:39 crc kubenswrapper[4772]: I0930 17:48:39.175161 4772 scope.go:117] "RemoveContainer" containerID="5bf3060bfca1ba22144a857644ddd0d91f6de5563995233629ede09f6657d81f" Sep 30 17:48:40 crc kubenswrapper[4772]: I0930 17:48:40.182880 4772 generic.go:334] "Generic (PLEG): container finished" podID="4e366f6f-7ee6-42c4-8a83-7cba085e2a46" 
containerID="73f60ff06e991cf1da69530e93c451cc091e1dcf976125ea308d3a5e85a54be1" exitCode=0 Sep 30 17:48:40 crc kubenswrapper[4772]: I0930 17:48:40.182968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" event={"ID":"4e366f6f-7ee6-42c4-8a83-7cba085e2a46","Type":"ContainerDied","Data":"73f60ff06e991cf1da69530e93c451cc091e1dcf976125ea308d3a5e85a54be1"} Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.602283 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.657170 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.657489 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.657610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.657759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.658038 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts8ss\" (UniqueName: \"kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.658232 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0\") pod \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\" (UID: \"4e366f6f-7ee6-42c4-8a83-7cba085e2a46\") " Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.663536 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph" (OuterVolumeSpecName: "ceph") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.663929 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.664001 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss" (OuterVolumeSpecName: "kube-api-access-ts8ss") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "kube-api-access-ts8ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.686799 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.688867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory" (OuterVolumeSpecName: "inventory") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.689897 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4e366f6f-7ee6-42c4-8a83-7cba085e2a46" (UID: "4e366f6f-7ee6-42c4-8a83-7cba085e2a46"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761068 4772 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761115 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761129 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761145 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761158 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:41 crc kubenswrapper[4772]: I0930 17:48:41.761168 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts8ss\" (UniqueName: \"kubernetes.io/projected/4e366f6f-7ee6-42c4-8a83-7cba085e2a46-kube-api-access-ts8ss\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.203136 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" event={"ID":"4e366f6f-7ee6-42c4-8a83-7cba085e2a46","Type":"ContainerDied","Data":"490e5e07e7ed157e20fddc41b06639ba8cfaf18d6bbd8038952463e176e5f35e"} Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.203705 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490e5e07e7ed157e20fddc41b06639ba8cfaf18d6bbd8038952463e176e5f35e" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.203275 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-llw7r" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293041 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf"] Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293583 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293608 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293626 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="extract-utilities" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293633 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="extract-utilities" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293653 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e366f6f-7ee6-42c4-8a83-7cba085e2a46" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293659 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e366f6f-7ee6-42c4-8a83-7cba085e2a46" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293671 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="extract-utilities" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293677 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="extract-utilities" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293693 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="extract-content" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293699 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="extract-content" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293707 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="extract-content" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293712 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="extract-content" Sep 30 17:48:42 crc kubenswrapper[4772]: E0930 17:48:42.293728 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293733 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293929 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb43fb4-c218-4ba0-a381-1484bc16637a" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293964 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e366f6f-7ee6-42c4-8a83-7cba085e2a46" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.293977 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac636cfe-2c67-4c9b-9977-fcd53f0ef67d" containerName="registry-server" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.295462 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300291 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300469 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300479 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300608 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300652 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300738 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.300865 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.316595 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf"] Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372783 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9hsvp\" (UniqueName: \"kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372833 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.372983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.373010 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.474804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hsvp\" (UniqueName: \"kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.474856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.474911 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.474938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.474956 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.475038 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.475081 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.480410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.480544 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.481250 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.481591 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.485753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.493682 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.493917 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hsvp\" (UniqueName: \"kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:42 crc kubenswrapper[4772]: I0930 17:48:42.622193 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:48:43 crc kubenswrapper[4772]: I0930 17:48:43.132583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf"] Sep 30 17:48:43 crc kubenswrapper[4772]: I0930 17:48:43.211757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" event={"ID":"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb","Type":"ContainerStarted","Data":"21a0e8018ffd67f25c04925b91e0d74526af8b3cd2fc04481393422991a3561b"} Sep 30 17:48:44 crc kubenswrapper[4772]: I0930 17:48:44.222318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" event={"ID":"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb","Type":"ContainerStarted","Data":"505204bc2f4cb394c92aa1589e495917dc612631ca40303537a6f84ab35c077b"} Sep 30 17:48:44 crc kubenswrapper[4772]: I0930 17:48:44.247719 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" podStartSLOduration=1.567164054 podStartE2EDuration="2.24770105s" podCreationTimestamp="2025-09-30 17:48:42 +0000 UTC" firstStartedPulling="2025-09-30 17:48:43.124343204 +0000 UTC m=+2824.031356035" lastFinishedPulling="2025-09-30 17:48:43.8048802 +0000 UTC m=+2824.711893031" observedRunningTime="2025-09-30 17:48:44.241631249 +0000 UTC m=+2825.148644100" watchObservedRunningTime="2025-09-30 17:48:44.24770105 +0000 UTC m=+2825.154713881" Sep 30 17:49:53 crc kubenswrapper[4772]: I0930 17:49:53.814489 4772 generic.go:334] "Generic (PLEG): container finished" podID="6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" containerID="505204bc2f4cb394c92aa1589e495917dc612631ca40303537a6f84ab35c077b" 
Sep 30 17:49:53 crc kubenswrapper[4772]: I0930 17:49:53.814489 4772 generic.go:334] "Generic (PLEG): container finished" podID="6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" containerID="505204bc2f4cb394c92aa1589e495917dc612631ca40303537a6f84ab35c077b" exitCode=0
Sep 30 17:49:53 crc kubenswrapper[4772]: I0930 17:49:53.814585 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" event={"ID":"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb","Type":"ContainerDied","Data":"505204bc2f4cb394c92aa1589e495917dc612631ca40303537a6f84ab35c077b"}
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.249391 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf"
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373647 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hsvp\" (UniqueName: \"kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373721 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373754 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373840 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.373951 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.374023 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory\") pod \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\" (UID: \"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb\") "
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.381370 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp" (OuterVolumeSpecName: "kube-api-access-9hsvp") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "kube-api-access-9hsvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.393445 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph" (OuterVolumeSpecName: "ceph") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.394707 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.410143 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.418357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory" (OuterVolumeSpecName: "inventory") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.424531 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.435042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" (UID: "6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.476503 4772 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.476790 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.476904 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-ceph\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.476996 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.477104 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hsvp\" (UniqueName: \"kubernetes.io/projected/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-kube-api-access-9hsvp\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.477197 4772 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.477288 4772 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.833617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" event={"ID":"6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb","Type":"ContainerDied","Data":"21a0e8018ffd67f25c04925b91e0d74526af8b3cd2fc04481393422991a3561b"}
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.833939 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a0e8018ffd67f25c04925b91e0d74526af8b3cd2fc04481393422991a3561b"
Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.833731 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf"
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.974738 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz"] Sep 30 17:49:55 crc kubenswrapper[4772]: E0930 17:49:55.975186 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.975210 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.975506 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.976773 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.979247 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.979359 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.979581 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.980339 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.980527 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.980798 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:49:55 crc kubenswrapper[4772]: I0930 17:49:55.994724 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz"] Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086707 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086796 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086828 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086853 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.086932 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8n9\" (UniqueName: \"kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188186 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188250 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188295 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188318 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188336 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.188401 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8n9\" (UniqueName: \"kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.193260 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.193783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.194165 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.194622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.194986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.206948 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8n9\" (UniqueName: \"kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.293222 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.832242 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz"] Sep 30 17:49:56 crc kubenswrapper[4772]: I0930 17:49:56.848139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" event={"ID":"cc4ef050-7f47-4f1f-a62e-4607d290ddf3","Type":"ContainerStarted","Data":"e759ce77b6bca0bdf640bcbef1c8f6c67dead2aab91f2b0959b66c3888cf8086"} Sep 30 17:49:57 crc kubenswrapper[4772]: I0930 17:49:57.857103 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" event={"ID":"cc4ef050-7f47-4f1f-a62e-4607d290ddf3","Type":"ContainerStarted","Data":"bb40c4a6f0c2a86766c15c61aa71c2e0a232d03a6213edaff512421ed5a26b20"} Sep 30 17:49:57 crc kubenswrapper[4772]: I0930 17:49:57.879192 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" podStartSLOduration=2.328474531 podStartE2EDuration="2.879177216s" podCreationTimestamp="2025-09-30 17:49:55 +0000 UTC" firstStartedPulling="2025-09-30 17:49:56.840098366 +0000 UTC m=+2897.747111197" lastFinishedPulling="2025-09-30 17:49:57.390801051 +0000 UTC m=+2898.297813882" observedRunningTime="2025-09-30 17:49:57.873336731 +0000 UTC m=+2898.780349582" watchObservedRunningTime="2025-09-30 17:49:57.879177216 +0000 UTC m=+2898.786190047" Sep 30 17:50:36 crc kubenswrapper[4772]: I0930 17:50:36.874854 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"] Sep 30 17:50:36 crc kubenswrapper[4772]: I0930 17:50:36.877225 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:36 crc kubenswrapper[4772]: I0930 17:50:36.888257 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"] Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.002869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2v75\" (UniqueName: \"kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.003162 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.003229 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.104862 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2v75\" (UniqueName: \"kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.105027 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.105087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.105524 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.105619 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.127119 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k2v75\" (UniqueName: \"kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75\") pod \"redhat-operators-s2hpj\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") " pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.213941 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2hpj" Sep 30 17:50:37 crc kubenswrapper[4772]: I0930 17:50:37.651938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"] Sep 30 17:50:37 crc kubenswrapper[4772]: W0930 17:50:37.655228 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262f9cc7_e68f_4c93_a4ad_de613734fb74.slice/crio-99a0676be875bf5b7c749b0078ec2ab8b13eb607c159e0e69e01324d6918b948 WatchSource:0}: Error finding container 99a0676be875bf5b7c749b0078ec2ab8b13eb607c159e0e69e01324d6918b948: Status 404 returned error can't find the container with id 99a0676be875bf5b7c749b0078ec2ab8b13eb607c159e0e69e01324d6918b948 Sep 30 17:50:38 crc kubenswrapper[4772]: I0930 17:50:38.205334 4772 generic.go:334] "Generic (PLEG): container finished" podID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerID="6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138" exitCode=0 Sep 30 17:50:38 crc kubenswrapper[4772]: I0930 17:50:38.205409 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerDied","Data":"6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138"} Sep 30 17:50:38 crc kubenswrapper[4772]: I0930 17:50:38.205473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerStarted","Data":"99a0676be875bf5b7c749b0078ec2ab8b13eb607c159e0e69e01324d6918b948"} Sep 30 17:50:38 crc kubenswrapper[4772]: I0930 17:50:38.656594 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:50:38 crc kubenswrapper[4772]: I0930 17:50:38.656966 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.234384 4772 generic.go:334] "Generic (PLEG): container finished" podID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerID="276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6" exitCode=0 Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.234519 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerDied","Data":"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"} Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.237001 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:50:42 
Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.234384 4772 generic.go:334] "Generic (PLEG): container finished" podID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerID="276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6" exitCode=0
Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.234519 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerDied","Data":"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"}
Sep 30 17:50:40 crc kubenswrapper[4772]: I0930 17:50:40.237001 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:50:42 crc kubenswrapper[4772]: I0930 17:50:42.252215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerStarted","Data":"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"}
Sep 30 17:50:42 crc kubenswrapper[4772]: I0930 17:50:42.277275 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s2hpj" podStartSLOduration=3.138036907 podStartE2EDuration="6.277249283s" podCreationTimestamp="2025-09-30 17:50:36 +0000 UTC" firstStartedPulling="2025-09-30 17:50:38.20938331 +0000 UTC m=+2939.116396141" lastFinishedPulling="2025-09-30 17:50:41.348595686 +0000 UTC m=+2942.255608517" observedRunningTime="2025-09-30 17:50:42.273572585 +0000 UTC m=+2943.180585416" watchObservedRunningTime="2025-09-30 17:50:42.277249283 +0000 UTC m=+2943.184262114"
Sep 30 17:50:47 crc kubenswrapper[4772]: I0930 17:50:47.214971 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:47 crc kubenswrapper[4772]: I0930 17:50:47.215618 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:47 crc kubenswrapper[4772]: I0930 17:50:47.275114 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:47 crc kubenswrapper[4772]: I0930 17:50:47.363489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:47 crc kubenswrapper[4772]: I0930 17:50:47.519360 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"]
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.308813 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s2hpj" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="registry-server" containerID="cri-o://a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0" gracePeriod=2
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.769624 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.874117 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2v75\" (UniqueName: \"kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75\") pod \"262f9cc7-e68f-4c93-a4ad-de613734fb74\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") "
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.874481 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content\") pod \"262f9cc7-e68f-4c93-a4ad-de613734fb74\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") "
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.874637 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities\") pod \"262f9cc7-e68f-4c93-a4ad-de613734fb74\" (UID: \"262f9cc7-e68f-4c93-a4ad-de613734fb74\") "
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.875513 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities" (OuterVolumeSpecName: "utilities") pod "262f9cc7-e68f-4c93-a4ad-de613734fb74" (UID: "262f9cc7-e68f-4c93-a4ad-de613734fb74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.883474 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75" (OuterVolumeSpecName: "kube-api-access-k2v75") pod "262f9cc7-e68f-4c93-a4ad-de613734fb74" (UID: "262f9cc7-e68f-4c93-a4ad-de613734fb74"). InnerVolumeSpecName "kube-api-access-k2v75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.977719 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:49 crc kubenswrapper[4772]: I0930 17:50:49.977770 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2v75\" (UniqueName: \"kubernetes.io/projected/262f9cc7-e68f-4c93-a4ad-de613734fb74-kube-api-access-k2v75\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.319107 4772 generic.go:334] "Generic (PLEG): container finished" podID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerID="a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0" exitCode=0
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.319161 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerDied","Data":"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"}
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.319189 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2hpj" event={"ID":"262f9cc7-e68f-4c93-a4ad-de613734fb74","Type":"ContainerDied","Data":"99a0676be875bf5b7c749b0078ec2ab8b13eb607c159e0e69e01324d6918b948"}
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.319207 4772 scope.go:117] "RemoveContainer" containerID="a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.319345 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2hpj"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.339688 4772 scope.go:117] "RemoveContainer" containerID="276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.360636 4772 scope.go:117] "RemoveContainer" containerID="6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.411947 4772 scope.go:117] "RemoveContainer" containerID="a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"
Sep 30 17:50:50 crc kubenswrapper[4772]: E0930 17:50:50.412421 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0\": container with ID starting with a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0 not found: ID does not exist" containerID="a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.412456 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0"} err="failed to get container status \"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0\": rpc error: code = NotFound desc = could not find container \"a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0\": container with ID starting with a2291956675a1149495f353ef5e26a76c38e949c6403cb1f94136319c801fec0 not found: ID does not exist"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.412477 4772 scope.go:117] "RemoveContainer" containerID="276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"
Sep 30 17:50:50 crc kubenswrapper[4772]: E0930 17:50:50.412757 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6\": container with ID starting with 276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6 not found: ID does not exist" containerID="276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.412784 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6"} err="failed to get container status \"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6\": rpc error: code = NotFound desc = could not find container \"276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6\": container with ID starting with 276c6970024d414fcc0b59e00308d3eedff622fca1bc8cd67db48bead9daeed6 not found: ID does not exist"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.412801 4772 scope.go:117] "RemoveContainer" containerID="6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138"
Sep 30 17:50:50 crc kubenswrapper[4772]: E0930 17:50:50.413128 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138\": container with ID starting with 6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138 not found: ID does not exist" containerID="6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.413152 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138"} err="failed to get container status \"6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138\": rpc error: code = NotFound desc = could not find container \"6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138\": container with ID starting with 6975ec32f4687edfca5b9296a2211b46b2043acc2b409cf4c35b5dc0c3f50138 not found: ID does not exist"
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.449798 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "262f9cc7-e68f-4c93-a4ad-de613734fb74" (UID: "262f9cc7-e68f-4c93-a4ad-de613734fb74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.488038 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262f9cc7-e68f-4c93-a4ad-de613734fb74-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.656336 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"]
Sep 30 17:50:50 crc kubenswrapper[4772]: I0930 17:50:50.665355 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s2hpj"]
Sep 30 17:50:51 crc kubenswrapper[4772]: I0930 17:50:51.920722 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" path="/var/lib/kubelet/pods/262f9cc7-e68f-4c93-a4ad-de613734fb74/volumes"
Sep 30 17:51:08 crc kubenswrapper[4772]: I0930 17:51:08.655783 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:51:08 crc kubenswrapper[4772]: I0930 17:51:08.656328 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:51:38 crc kubenswrapper[4772]: I0930 17:51:38.655438 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:51:38 crc kubenswrapper[4772]: I0930 17:51:38.656045 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:51:38 crc kubenswrapper[4772]: I0930 17:51:38.656153 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll"
Sep 30 17:51:38 crc kubenswrapper[4772]: I0930 17:51:38.656927 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:51:38 crc kubenswrapper[4772]: I0930 17:51:38.656990 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" gracePeriod=600
Sep 30 17:51:38 crc kubenswrapper[4772]: E0930 17:51:38.862104 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:51:39 crc kubenswrapper[4772]: I0930 17:51:39.767007 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" exitCode=0
Sep 30 17:51:39 crc kubenswrapper[4772]: I0930 17:51:39.767074 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"}
Sep 30 17:51:39 crc kubenswrapper[4772]: I0930 17:51:39.767366 4772 scope.go:117] "RemoveContainer" containerID="8046870d891ddfdf411a3278e2dc77f693a1e3a362e02818c60160e1e36f3da0"
Sep 30 17:51:39 crc kubenswrapper[4772]: I0930 17:51:39.767905 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:51:39 crc kubenswrapper[4772]: E0930 17:51:39.768237 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:51:51 crc kubenswrapper[4772]: I0930 17:51:51.898617 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:51:51 crc kubenswrapper[4772]: E0930 17:51:51.899539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:52:03 crc kubenswrapper[4772]: I0930 17:52:03.898618 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:52:03 crc kubenswrapper[4772]: E0930 17:52:03.899574 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:52:14 crc kubenswrapper[4772]: I0930 17:52:14.898402 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:52:14 crc kubenswrapper[4772]: E0930 17:52:14.899201 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:52:27 crc kubenswrapper[4772]: I0930 17:52:27.898638 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:52:27 crc kubenswrapper[4772]: E0930 17:52:27.899409 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:52:38 crc kubenswrapper[4772]: I0930 17:52:38.898952 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:52:38 crc kubenswrapper[4772]: E0930 17:52:38.899852 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:52:51 crc kubenswrapper[4772]: I0930 17:52:51.898208 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:52:51 crc kubenswrapper[4772]: E0930 17:52:51.899254 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:53:02 crc kubenswrapper[4772]: I0930 17:53:02.898154 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:53:02 crc kubenswrapper[4772]: E0930 17:53:02.899539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:53:15 crc kubenswrapper[4772]: I0930 17:53:15.898529 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:53:15 crc kubenswrapper[4772]: E0930 17:53:15.899219 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:53:30 crc kubenswrapper[4772]: I0930 17:53:30.897805 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:53:30 crc kubenswrapper[4772]: E0930 17:53:30.898667 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:53:42 crc kubenswrapper[4772]: I0930 17:53:42.898395 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:53:42 crc kubenswrapper[4772]: E0930 17:53:42.899221 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:53:54 crc kubenswrapper[4772]: I0930 17:53:54.898007 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:53:54 crc kubenswrapper[4772]: E0930 17:53:54.898782 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:54:09 crc kubenswrapper[4772]: I0930 17:54:09.906716 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:54:09 crc kubenswrapper[4772]: E0930 17:54:09.907652 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:54:24 crc kubenswrapper[4772]: I0930 17:54:24.899429 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:54:24 crc kubenswrapper[4772]: E0930 17:54:24.900298 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:54:37 crc kubenswrapper[4772]: I0930 17:54:37.898428 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:54:37 crc kubenswrapper[4772]: E0930 17:54:37.899252 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:54:48 crc kubenswrapper[4772]: I0930 17:54:48.898174 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:54:48 crc kubenswrapper[4772]: E0930 17:54:48.898762 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:55:01 crc kubenswrapper[4772]: I0930 17:55:01.898774 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe"
Sep 30 17:55:01 crc kubenswrapper[4772]: E0930 17:55:01.899776 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 17:55:06 crc kubenswrapper[4772]: I0930 17:55:06.677481 4772 generic.go:334] "Generic (PLEG): container finished" podID="cc4ef050-7f47-4f1f-a62e-4607d290ddf3" containerID="bb40c4a6f0c2a86766c15c61aa71c2e0a232d03a6213edaff512421ed5a26b20" exitCode=0
Sep 30 17:55:06 crc kubenswrapper[4772]: I0930 17:55:06.677668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" event={"ID":"cc4ef050-7f47-4f1f-a62e-4607d290ddf3","Type":"ContainerDied","Data":"bb40c4a6f0c2a86766c15c61aa71c2e0a232d03a6213edaff512421ed5a26b20"}
kubenswrapper[4772]: I0930 17:55:08.097710 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.270909 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.271260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.271309 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.271344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f8n9\" (UniqueName: \"kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.271382 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.271534 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph\") pod \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\" (UID: \"cc4ef050-7f47-4f1f-a62e-4607d290ddf3\") " Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.276577 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.277834 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9" (OuterVolumeSpecName: "kube-api-access-5f8n9") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "kube-api-access-5f8n9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.278017 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph" (OuterVolumeSpecName: "ceph") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.297304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory" (OuterVolumeSpecName: "inventory") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.298522 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.303754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc4ef050-7f47-4f1f-a62e-4607d290ddf3" (UID: "cc4ef050-7f47-4f1f-a62e-4607d290ddf3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.373412 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.373690 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.373784 4772 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.373870 4772 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.373956 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f8n9\" (UniqueName: \"kubernetes.io/projected/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-kube-api-access-5f8n9\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.374071 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4ef050-7f47-4f1f-a62e-4607d290ddf3-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.697617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" 
event={"ID":"cc4ef050-7f47-4f1f-a62e-4607d290ddf3","Type":"ContainerDied","Data":"e759ce77b6bca0bdf640bcbef1c8f6c67dead2aab91f2b0959b66c3888cf8086"} Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.697658 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e759ce77b6bca0bdf640bcbef1c8f6c67dead2aab91f2b0959b66c3888cf8086" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.697731 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.831411 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv"] Sep 30 17:55:08 crc kubenswrapper[4772]: E0930 17:55:08.831943 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4ef050-7f47-4f1f-a62e-4607d290ddf3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.831968 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4ef050-7f47-4f1f-a62e-4607d290ddf3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 17:55:08 crc kubenswrapper[4772]: E0930 17:55:08.832001 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="extract-utilities" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.832009 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="extract-utilities" Sep 30 17:55:08 crc kubenswrapper[4772]: E0930 17:55:08.832020 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="registry-server" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.832028 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="registry-server" Sep 30 17:55:08 crc kubenswrapper[4772]: E0930 17:55:08.832044 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="extract-content" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.832052 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="extract-content" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.832289 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4ef050-7f47-4f1f-a62e-4607d290ddf3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.832309 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="262f9cc7-e68f-4c93-a4ad-de613734fb74" containerName="registry-server" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.833223 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835171 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835692 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835775 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835920 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.836137 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835942 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.835697 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.838014 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.839504 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.856563 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv"] Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987265 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987434 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987574 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.987944 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.988003 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfttz\" (UniqueName: \"kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.988047 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.988183 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:08 crc kubenswrapper[4772]: I0930 17:55:08.988393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" 
Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.089940 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.089997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090026 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090075 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090168 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090194 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090806 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: 
\"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfttz\" (UniqueName: \"kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090893 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.090939 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.091317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.092516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.093885 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.093986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.094551 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: 
\"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.095149 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.095214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.095634 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.094859 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.107767 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.117293 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfttz\" (UniqueName: \"kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.155720 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.660126 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv"] Sep 30 17:55:09 crc kubenswrapper[4772]: I0930 17:55:09.714908 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" event={"ID":"52dddfd0-5fcc-47be-96c2-e3427fc66069","Type":"ContainerStarted","Data":"b548246047ce833727ee221d2e0753144ab495985df71984e6e47daa40d8462d"} Sep 30 17:55:10 crc kubenswrapper[4772]: I0930 17:55:10.724514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" event={"ID":"52dddfd0-5fcc-47be-96c2-e3427fc66069","Type":"ContainerStarted","Data":"06408bfcbff45d7872436e883b6661a436d1f8766822f5e01c282caccf167030"} Sep 30 17:55:10 crc kubenswrapper[4772]: I0930 17:55:10.750643 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" podStartSLOduration=2.232761364 podStartE2EDuration="2.750623238s" podCreationTimestamp="2025-09-30 17:55:08 +0000 UTC" firstStartedPulling="2025-09-30 17:55:09.671744748 +0000 UTC m=+3210.578757579" lastFinishedPulling="2025-09-30 17:55:10.189606612 +0000 UTC m=+3211.096619453" observedRunningTime="2025-09-30 17:55:10.749502688 +0000 UTC m=+3211.656515519" watchObservedRunningTime="2025-09-30 17:55:10.750623238 +0000 UTC m=+3211.657636069" Sep 30 17:55:13 crc kubenswrapper[4772]: I0930 17:55:13.898267 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:55:13 crc kubenswrapper[4772]: E0930 17:55:13.899113 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:55:26 crc kubenswrapper[4772]: I0930 17:55:26.897974 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:55:26 crc kubenswrapper[4772]: E0930 17:55:26.899405 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.549759 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.552878 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.560494 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.677239 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.677320 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66cg\" (UniqueName: \"kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.677371 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.778747 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66cg\" (UniqueName: \"kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.778839 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.778933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.779480 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.779563 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.805220 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f66cg\" (UniqueName: \"kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg\") pod \"certified-operators-wtvkz\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:27 crc kubenswrapper[4772]: I0930 17:55:27.879249 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:28 crc kubenswrapper[4772]: I0930 17:55:28.376026 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:28 crc kubenswrapper[4772]: I0930 17:55:28.882794 4772 generic.go:334] "Generic (PLEG): container finished" podID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerID="b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff" exitCode=0 Sep 30 17:55:28 crc kubenswrapper[4772]: I0930 17:55:28.882841 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerDied","Data":"b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff"} Sep 30 17:55:28 crc kubenswrapper[4772]: I0930 17:55:28.883145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerStarted","Data":"6a5d0464f36a1b624701440414d09f26156dc3e2fceaaaeeca303957ca1b8a6d"} Sep 30 17:55:29 crc kubenswrapper[4772]: I0930 17:55:29.894013 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerStarted","Data":"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316"} Sep 30 17:55:30 crc kubenswrapper[4772]: I0930 17:55:30.903365 4772 generic.go:334] "Generic (PLEG): container finished" podID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerID="1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316" exitCode=0 Sep 30 17:55:30 crc kubenswrapper[4772]: I0930 17:55:30.903416 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerDied","Data":"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316"} Sep 30 17:55:31 crc kubenswrapper[4772]: I0930 17:55:31.913692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerStarted","Data":"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42"} Sep 30 17:55:31 crc kubenswrapper[4772]: I0930 17:55:31.936919 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wtvkz" podStartSLOduration=2.519616191 podStartE2EDuration="4.936899903s" podCreationTimestamp="2025-09-30 17:55:27 +0000 UTC" firstStartedPulling="2025-09-30 17:55:28.884677171 +0000 UTC m=+3229.791690002" lastFinishedPulling="2025-09-30 17:55:31.301960883 +0000 UTC m=+3232.208973714" observedRunningTime="2025-09-30 17:55:31.936690597 +0000 UTC m=+3232.843703428" watchObservedRunningTime="2025-09-30 17:55:31.936899903 +0000 UTC m=+3232.843912734" Sep 30 17:55:37 crc kubenswrapper[4772]: I0930 17:55:37.879813 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:37 crc kubenswrapper[4772]: I0930 17:55:37.880418 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:37 crc kubenswrapper[4772]: I0930 17:55:37.935022 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:38 crc kubenswrapper[4772]: I0930 17:55:38.015579 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.135306 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.135851 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wtvkz" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="registry-server" containerID="cri-o://1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42" gracePeriod=2 Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.899213 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:55:41 crc kubenswrapper[4772]: E0930 17:55:41.900135 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.947531 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.961904 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f66cg\" (UniqueName: \"kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg\") pod \"2519098d-7bba-4efb-8e7a-5e71e0cab314\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.962172 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities\") pod \"2519098d-7bba-4efb-8e7a-5e71e0cab314\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.962320 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content\") pod \"2519098d-7bba-4efb-8e7a-5e71e0cab314\" (UID: \"2519098d-7bba-4efb-8e7a-5e71e0cab314\") " Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.964698 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities" (OuterVolumeSpecName: "utilities") pod "2519098d-7bba-4efb-8e7a-5e71e0cab314" (UID: "2519098d-7bba-4efb-8e7a-5e71e0cab314"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:55:41 crc kubenswrapper[4772]: I0930 17:55:41.994442 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg" (OuterVolumeSpecName: "kube-api-access-f66cg") pod "2519098d-7bba-4efb-8e7a-5e71e0cab314" (UID: "2519098d-7bba-4efb-8e7a-5e71e0cab314"). InnerVolumeSpecName "kube-api-access-f66cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.007767 4772 generic.go:334] "Generic (PLEG): container finished" podID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerID="1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42" exitCode=0 Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.007834 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtvkz" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.007849 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerDied","Data":"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42"} Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.007922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtvkz" event={"ID":"2519098d-7bba-4efb-8e7a-5e71e0cab314","Type":"ContainerDied","Data":"6a5d0464f36a1b624701440414d09f26156dc3e2fceaaaeeca303957ca1b8a6d"} Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.007964 4772 scope.go:117] "RemoveContainer" containerID="1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.034393 4772 scope.go:117] "RemoveContainer" containerID="1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.044596 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2519098d-7bba-4efb-8e7a-5e71e0cab314" (UID: "2519098d-7bba-4efb-8e7a-5e71e0cab314"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.055842 4772 scope.go:117] "RemoveContainer" containerID="b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.065669 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.065733 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f66cg\" (UniqueName: \"kubernetes.io/projected/2519098d-7bba-4efb-8e7a-5e71e0cab314-kube-api-access-f66cg\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.065749 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2519098d-7bba-4efb-8e7a-5e71e0cab314-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.118889 4772 scope.go:117] "RemoveContainer" containerID="1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42" Sep 30 17:55:42 crc kubenswrapper[4772]: E0930 17:55:42.119342 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42\": container with ID starting with 1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42 not found: ID does not exist" containerID="1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.119378 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42"} err="failed to get container status \"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42\": rpc error: code = NotFound desc = could not find container \"1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42\": container with ID starting with 1a311c752c391bd30c1ba7cdb36a673b871c0463892d63801bf5c0da4b671d42 not found: ID does not exist" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.119404 4772 scope.go:117] "RemoveContainer" containerID="1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316" Sep 30 17:55:42 crc kubenswrapper[4772]: E0930 17:55:42.119776 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316\": container with ID starting with 1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316 not found: ID does not exist" containerID="1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.119801 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316"} err="failed to get container status \"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316\": rpc error: code = NotFound desc = could not find container \"1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316\": container with ID starting with 1b24a714abc4a88baaac2ea4935aab42f6a1b4b4fef444711b1cf9ca67e39316 not found: ID does not exist" Sep 30 17:55:42 crc 
kubenswrapper[4772]: I0930 17:55:42.119819 4772 scope.go:117] "RemoveContainer" containerID="b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff" Sep 30 17:55:42 crc kubenswrapper[4772]: E0930 17:55:42.120169 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff\": container with ID starting with b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff not found: ID does not exist" containerID="b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.120196 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff"} err="failed to get container status \"b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff\": rpc error: code = NotFound desc = could not find container \"b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff\": container with ID starting with b8524ca6989a6816f6ed1ff03c2a8e090d57b8318dd88f5ae3bf59bfe83ba1ff not found: ID does not exist" Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.337007 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:42 crc kubenswrapper[4772]: I0930 17:55:42.345147 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wtvkz"] Sep 30 17:55:43 crc kubenswrapper[4772]: I0930 17:55:43.910403 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" path="/var/lib/kubelet/pods/2519098d-7bba-4efb-8e7a-5e71e0cab314/volumes" Sep 30 17:55:56 crc kubenswrapper[4772]: I0930 17:55:56.899508 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:55:56 crc kubenswrapper[4772]: E0930 17:55:56.900444 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:56:11 crc kubenswrapper[4772]: I0930 17:56:11.899301 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:56:11 crc kubenswrapper[4772]: E0930 17:56:11.900876 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:56:22 crc kubenswrapper[4772]: I0930 17:56:22.898944 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:56:22 crc kubenswrapper[4772]: E0930 17:56:22.899812 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:56:34 crc kubenswrapper[4772]: I0930 17:56:34.898586 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:56:34 crc kubenswrapper[4772]: E0930 17:56:34.899486 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 17:56:47 crc kubenswrapper[4772]: I0930 17:56:47.899925 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 17:56:48 crc kubenswrapper[4772]: I0930 17:56:48.603159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a"} Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.119777 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:24 crc kubenswrapper[4772]: E0930 17:57:24.120894 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="registry-server" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.120911 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="registry-server" Sep 30 17:57:24 crc kubenswrapper[4772]: E0930 17:57:24.120932 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="extract-content" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.120938 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="extract-content" Sep 30 17:57:24 crc kubenswrapper[4772]: E0930 17:57:24.120946 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="extract-utilities" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.120953 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="extract-utilities" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.121191 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2519098d-7bba-4efb-8e7a-5e71e0cab314" containerName="registry-server"
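
Two things happen above. First, at 17:56:47 the "RemoveContainer" for machine-config-daemon finally proceeds with no back-off error, and the container starts at 17:56:48, closing the crash-loop arc that began before 17:53. Second, when community-operators-f6rhl is admitted, cpu_manager and memory_manager prune their saved state for the deleted certified-operators-wtvkz containers (podUID 2519098d-...). A sketch of that RemoveStaleState pruning, with illustrative data shapes rather than the managers' real types:

```go
// Sketch of RemoveStaleState-style pruning: on pod admission, drop saved
// resource assignments for containers whose pods are no longer active.
package main

import "fmt"

type key struct{ podUID, container string }

func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k) // "Deleted CPUSet assignment" in the log
		}
	}
}

func main() {
	gone := "2519098d-7bba-4efb-8e7a-5e71e0cab314" // certified-operators-wtvkz
	assignments := map[key]string{
		{gone, "registry-server"}:   "cpuset 0-3",
		{gone, "extract-content"}:   "cpuset 0-3",
		{gone, "extract-utilities"}: "cpuset 0-3",
	}
	active := map[string]bool{
		"17a7eb39-b409-4556-8be2-aecdb3aae575": true, // community-operators-f6rhl
	}
	removeStaleState(assignments, active)
}
```

Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.122627 4772 util.go:30] "No sandbox for pod can be found.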
Need to start a new one" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.130718 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.301317 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drl9r\" (UniqueName: \"kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.301750 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.301788 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.403681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drl9r\" (UniqueName: \"kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.403906 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.403957 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.404456 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.405233 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.421182 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-drl9r\" (UniqueName: \"kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r\") pod \"community-operators-f6rhl\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.447802 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:24 crc kubenswrapper[4772]: I0930 17:57:24.906807 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:25 crc kubenswrapper[4772]: I0930 17:57:25.035326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerStarted","Data":"8327b1aa8f478594f8b6fbc99bf56f27fb3c93ebc28ecc2d242747910399bcd4"} Sep 30 17:57:26 crc kubenswrapper[4772]: I0930 17:57:26.049303 4772 generic.go:334] "Generic (PLEG): container finished" podID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerID="5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c" exitCode=0 Sep 30 17:57:26 crc kubenswrapper[4772]: I0930 17:57:26.049408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerDied","Data":"5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c"} Sep 30 17:57:26 crc kubenswrapper[4772]: I0930 17:57:26.051883 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:57:27 crc kubenswrapper[4772]: I0930 17:57:27.059927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerStarted","Data":"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2"} Sep 30 17:57:28 crc kubenswrapper[4772]: I0930 17:57:28.072737 4772 generic.go:334] "Generic (PLEG): container finished" podID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerID="c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2" exitCode=0 Sep 30 17:57:28 crc kubenswrapper[4772]: I0930 17:57:28.072836 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerDied","Data":"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2"} Sep 30 17:57:28 crc kubenswrapper[4772]: I0930 17:57:28.073266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerStarted","Data":"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e"} Sep 30 17:57:28 crc kubenswrapper[4772]: I0930 17:57:28.096117 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f6rhl" podStartSLOduration=2.673580237 podStartE2EDuration="4.096092252s" podCreationTimestamp="2025-09-30 17:57:24 +0000 UTC" firstStartedPulling="2025-09-30 17:57:26.051656448 +0000 UTC m=+3346.958669279" lastFinishedPulling="2025-09-30 17:57:27.474168463 +0000 UTC m=+3348.381181294" observedRunningTime="2025-09-30 17:57:28.088584953 +0000 UTC m=+3348.995597794" watchObservedRunningTime="2025-09-30 
17:57:28.096092252 +0000 UTC m=+3349.003105083" Sep 30 17:57:34 crc kubenswrapper[4772]: I0930 17:57:34.448196 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:34 crc kubenswrapper[4772]: I0930 17:57:34.449311 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:34 crc kubenswrapper[4772]: I0930 17:57:34.499126 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:35 crc kubenswrapper[4772]: I0930 17:57:35.177783 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:35 crc kubenswrapper[4772]: I0930 17:57:35.232495 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.147665 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f6rhl" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="registry-server" containerID="cri-o://3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e" gracePeriod=2 Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.597470 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.693820 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities\") pod \"17a7eb39-b409-4556-8be2-aecdb3aae575\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.694154 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drl9r\" (UniqueName: \"kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r\") pod \"17a7eb39-b409-4556-8be2-aecdb3aae575\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.694298 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content\") pod \"17a7eb39-b409-4556-8be2-aecdb3aae575\" (UID: \"17a7eb39-b409-4556-8be2-aecdb3aae575\") " Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.694840 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities" (OuterVolumeSpecName: "utilities") pod "17a7eb39-b409-4556-8be2-aecdb3aae575" (UID: "17a7eb39-b409-4556-8be2-aecdb3aae575"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.702332 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r" (OuterVolumeSpecName: "kube-api-access-drl9r") pod "17a7eb39-b409-4556-8be2-aecdb3aae575" (UID: "17a7eb39-b409-4556-8be2-aecdb3aae575"). InnerVolumeSpecName "kube-api-access-drl9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.797073 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drl9r\" (UniqueName: \"kubernetes.io/projected/17a7eb39-b409-4556-8be2-aecdb3aae575-kube-api-access-drl9r\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:37 crc kubenswrapper[4772]: I0930 17:57:37.797110 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.169369 4772 generic.go:334] "Generic (PLEG): container finished" podID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerID="3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e" exitCode=0 Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.169422 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerDied","Data":"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e"} Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.169464 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rhl" event={"ID":"17a7eb39-b409-4556-8be2-aecdb3aae575","Type":"ContainerDied","Data":"8327b1aa8f478594f8b6fbc99bf56f27fb3c93ebc28ecc2d242747910399bcd4"} Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.169482 4772 scope.go:117] "RemoveContainer" containerID="3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.169689 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rhl" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.177561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17a7eb39-b409-4556-8be2-aecdb3aae575" (UID: "17a7eb39-b409-4556-8be2-aecdb3aae575"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.205035 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a7eb39-b409-4556-8be2-aecdb3aae575-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.207006 4772 scope.go:117] "RemoveContainer" containerID="c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.235558 4772 scope.go:117] "RemoveContainer" containerID="5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.272529 4772 scope.go:117] "RemoveContainer" containerID="3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e" Sep 30 17:57:38 crc kubenswrapper[4772]: E0930 17:57:38.273152 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e\": container with ID starting with 3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e not found: ID does not exist" containerID="3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.273199 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e"} err="failed to get container status \"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e\": rpc error: code = NotFound desc = could not find container \"3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e\": container with ID starting with 3aeb9f515400e4e08995d89e754b7e026df76b9c963d795d97d27b030c627b0e not found: ID does not exist" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.273227 4772 scope.go:117] "RemoveContainer" containerID="c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2" Sep 30 17:57:38 crc kubenswrapper[4772]: E0930 17:57:38.273539 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2\": container with ID starting with c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2 not found: ID does not exist" containerID="c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.273591 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2"} err="failed to get container status \"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2\": rpc error: code = NotFound desc = could not find container \"c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2\": container with ID starting with c0a0ab08a643707f6c144704200e9d6dc349473eb980742e598a0d38f8a722c2 not found: ID does not exist" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.273659 4772 scope.go:117] "RemoveContainer" containerID="5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c" Sep 30 17:57:38 crc kubenswrapper[4772]: E0930 17:57:38.274045 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c\": container with ID starting with 5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c not found: ID does not exist" containerID="5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.274137 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c"} err="failed to get container status \"5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c\": rpc error: code = NotFound desc = could not find container \"5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c\": container with ID starting with 5d832a19a617b55b1ea22d79a8438042066d43e4829b9089fb21836ea79fc84c not found: ID does not exist" Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.515087 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:38 crc kubenswrapper[4772]: I0930 17:57:38.529777 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f6rhl"] Sep 30 17:57:39 crc kubenswrapper[4772]: I0930 17:57:39.917887 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" path="/var/lib/kubelet/pods/17a7eb39-b409-4556-8be2-aecdb3aae575/volumes" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.305859 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:07 crc kubenswrapper[4772]: E0930 17:58:07.306820 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="registry-server" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.306867 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="registry-server" Sep 30 17:58:07 crc kubenswrapper[4772]: E0930 17:58:07.306931 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="extract-utilities" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.306941 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="extract-utilities" Sep 30 17:58:07 crc kubenswrapper[4772]: E0930 17:58:07.306960 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="extract-content" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.306968 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="extract-content" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.307211 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a7eb39-b409-4556-8be2-aecdb3aae575" containerName="registry-server" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.309817 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.319523 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.508209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xg2\" (UniqueName: \"kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.508556 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.508581 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.610495 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xg2\" (UniqueName: \"kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.610976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.611004 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.611527 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.611568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.632396 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s9xg2\" (UniqueName: \"kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2\") pod \"redhat-marketplace-7hpgf\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:07 crc kubenswrapper[4772]: I0930 17:58:07.636128 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:08 crc kubenswrapper[4772]: I0930 17:58:08.123526 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:08 crc kubenswrapper[4772]: I0930 17:58:08.428433 4772 generic.go:334] "Generic (PLEG): container finished" podID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerID="ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172" exitCode=0 Sep 30 17:58:08 crc kubenswrapper[4772]: I0930 17:58:08.428485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerDied","Data":"ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172"} Sep 30 17:58:08 crc kubenswrapper[4772]: I0930 17:58:08.428799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerStarted","Data":"83ea5c8547e0482b1acabfcc5341039dc6fd1c57ba56c0b849559453b29f2046"} Sep 30 17:58:09 crc kubenswrapper[4772]: I0930 17:58:09.442368 4772 generic.go:334] "Generic (PLEG): container finished" podID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerID="dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079" exitCode=0 Sep 30 17:58:09 crc kubenswrapper[4772]: I0930 17:58:09.442514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerDied","Data":"dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079"} Sep 30 17:58:10 crc kubenswrapper[4772]: I0930 17:58:10.453631 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerStarted","Data":"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff"} Sep 30 17:58:10 crc kubenswrapper[4772]: I0930 17:58:10.484080 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7hpgf" podStartSLOduration=1.926663305 podStartE2EDuration="3.484047728s" podCreationTimestamp="2025-09-30 17:58:07 +0000 UTC" firstStartedPulling="2025-09-30 17:58:08.429954367 +0000 UTC m=+3389.336967198" lastFinishedPulling="2025-09-30 17:58:09.98733879 +0000 UTC m=+3390.894351621" observedRunningTime="2025-09-30 17:58:10.478153422 +0000 UTC m=+3391.385166253" watchObservedRunningTime="2025-09-30 17:58:10.484047728 +0000 UTC m=+3391.391060559" Sep 30 17:58:17 crc kubenswrapper[4772]: I0930 17:58:17.637366 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:17 crc kubenswrapper[4772]: I0930 17:58:17.637996 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:17 crc kubenswrapper[4772]: I0930 17:58:17.686162 4772 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:18 crc kubenswrapper[4772]: I0930 17:58:18.584548 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:18 crc kubenswrapper[4772]: I0930 17:58:18.642422 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:20 crc kubenswrapper[4772]: I0930 17:58:20.558917 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7hpgf" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="registry-server" containerID="cri-o://275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff" gracePeriod=2 Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.082759 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.191057 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content\") pod \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.191319 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xg2\" (UniqueName: \"kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2\") pod \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.191352 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities\") pod \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\" (UID: \"6bb8cb54-3988-4dc1-bfad-aebfe0949017\") " Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.192292 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities" (OuterVolumeSpecName: "utilities") pod "6bb8cb54-3988-4dc1-bfad-aebfe0949017" (UID: "6bb8cb54-3988-4dc1-bfad-aebfe0949017"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.197361 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2" (OuterVolumeSpecName: "kube-api-access-s9xg2") pod "6bb8cb54-3988-4dc1-bfad-aebfe0949017" (UID: "6bb8cb54-3988-4dc1-bfad-aebfe0949017"). InnerVolumeSpecName "kube-api-access-s9xg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.205149 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bb8cb54-3988-4dc1-bfad-aebfe0949017" (UID: "6bb8cb54-3988-4dc1-bfad-aebfe0949017"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.294380 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.294423 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xg2\" (UniqueName: \"kubernetes.io/projected/6bb8cb54-3988-4dc1-bfad-aebfe0949017-kube-api-access-s9xg2\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.294438 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb8cb54-3988-4dc1-bfad-aebfe0949017-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.568254 4772 generic.go:334] "Generic (PLEG): container finished" podID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerID="275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff" exitCode=0 Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.568307 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hpgf" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.568310 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerDied","Data":"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff"} Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.568407 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hpgf" event={"ID":"6bb8cb54-3988-4dc1-bfad-aebfe0949017","Type":"ContainerDied","Data":"83ea5c8547e0482b1acabfcc5341039dc6fd1c57ba56c0b849559453b29f2046"} Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.568424 4772 scope.go:117] "RemoveContainer" containerID="275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.599762 4772 scope.go:117] "RemoveContainer" containerID="dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.603355 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.613384 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hpgf"] Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.628174 4772 scope.go:117] "RemoveContainer" containerID="ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.696253 4772 scope.go:117] "RemoveContainer" containerID="275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff" Sep 30 17:58:21 crc kubenswrapper[4772]: E0930 17:58:21.697182 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff\": container with ID starting with 275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff not found: ID does not exist" containerID="275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.697229 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff"} err="failed to get container status \"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff\": rpc error: code = NotFound desc = could not find container \"275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff\": container with ID starting with 275704afd97345c90c244f9f93a1df89bfb6f054d2cf298ff0f17078207629ff not found: ID does not exist" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.697262 4772 scope.go:117] "RemoveContainer" containerID="dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079" Sep 30 17:58:21 crc kubenswrapper[4772]: E0930 17:58:21.698502 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079\": container with ID starting with dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079 not found: ID does not exist" containerID="dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.698563 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079"} err="failed to get container status \"dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079\": rpc error: code = NotFound desc = could not find container \"dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079\": container with ID starting with dcb38f52d9d2824d33fa6d767c57eed7d63db85b73f0f25e8639fb1fdc238079 not found: ID does not exist" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.698601 4772 scope.go:117] "RemoveContainer" containerID="ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172" Sep 30 17:58:21 crc kubenswrapper[4772]: E0930 17:58:21.699098 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172\": container with ID starting with ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172 not found: ID does not exist" containerID="ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.699144 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172"} err="failed to get container status \"ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172\": rpc error: code = NotFound desc = could not find container \"ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172\": container with ID starting with ccbc886626f054e1f2a1757e1064b2e4e97f8cd929cbedb875ed49ea8cbb8172 not found: ID does not exist" Sep 30 17:58:21 crc kubenswrapper[4772]: I0930 17:58:21.912403 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" path="/var/lib/kubelet/pods/6bb8cb54-3988-4dc1-bfad-aebfe0949017/volumes" Sep 30 17:59:08 crc kubenswrapper[4772]: I0930 17:59:08.655591 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:59:08 crc kubenswrapper[4772]: I0930 17:59:08.656135 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:59:13 crc kubenswrapper[4772]: I0930 17:59:13.019839 4772 generic.go:334] "Generic (PLEG): container finished" podID="52dddfd0-5fcc-47be-96c2-e3427fc66069" containerID="06408bfcbff45d7872436e883b6661a436d1f8766822f5e01c282caccf167030" exitCode=0 Sep 30 17:59:13 crc kubenswrapper[4772]: I0930 17:59:13.019951 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" event={"ID":"52dddfd0-5fcc-47be-96c2-e3427fc66069","Type":"ContainerDied","Data":"06408bfcbff45d7872436e883b6661a436d1f8766822f5e01c282caccf167030"} Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.455205 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650114 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650216 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650244 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650300 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfttz\" (UniqueName: \"kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.650353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.651046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.651144 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.651174 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.651214 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.651273 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle\") pod \"52dddfd0-5fcc-47be-96c2-e3427fc66069\" (UID: \"52dddfd0-5fcc-47be-96c2-e3427fc66069\") " Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.656288 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph" (OuterVolumeSpecName: "ceph") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.657449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.673130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz" (OuterVolumeSpecName: "kube-api-access-tfttz") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "kube-api-access-tfttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.682510 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.685106 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.689877 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.692550 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.692587 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.695423 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.696002 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory" (OuterVolumeSpecName: "inventory") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.705473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "52dddfd0-5fcc-47be-96c2-e3427fc66069" (UID: "52dddfd0-5fcc-47be-96c2-e3427fc66069"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.754474 4772 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.754729 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.754816 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfttz\" (UniqueName: \"kubernetes.io/projected/52dddfd0-5fcc-47be-96c2-e3427fc66069-kube-api-access-tfttz\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.754939 4772 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755025 4772 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755129 4772 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755214 4772 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755289 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755413 4772 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755501 4772 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:14 crc kubenswrapper[4772]: I0930 17:59:14.755604 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52dddfd0-5fcc-47be-96c2-e3427fc66069-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.041877 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" event={"ID":"52dddfd0-5fcc-47be-96c2-e3427fc66069","Type":"ContainerDied","Data":"b548246047ce833727ee221d2e0753144ab495985df71984e6e47daa40d8462d"} Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.041942 4772 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="b548246047ce833727ee221d2e0753144ab495985df71984e6e47daa40d8462d" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.041924 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.152661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql"] Sep 30 17:59:15 crc kubenswrapper[4772]: E0930 17:59:15.153121 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="extract-utilities" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153143 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="extract-utilities" Sep 30 17:59:15 crc kubenswrapper[4772]: E0930 17:59:15.153170 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="extract-content" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153181 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="extract-content" Sep 30 17:59:15 crc kubenswrapper[4772]: E0930 17:59:15.153188 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="registry-server" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153196 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="registry-server" Sep 30 17:59:15 crc kubenswrapper[4772]: E0930 17:59:15.153213 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52dddfd0-5fcc-47be-96c2-e3427fc66069" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153224 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="52dddfd0-5fcc-47be-96c2-e3427fc66069" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153412 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb8cb54-3988-4dc1-bfad-aebfe0949017" containerName="registry-server" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.153448 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="52dddfd0-5fcc-47be-96c2-e3427fc66069" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.154316 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.157696 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.157947 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.158046 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.158314 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-98pz9" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.158462 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.158974 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.163958 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql"] Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265546 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265585 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265714 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.265940 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.266144 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.266219 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnx8t\" (UniqueName: \"kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.367823 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.367885 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.367922 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.368016 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.368108 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.368136 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.368223 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.368266 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnx8t\" (UniqueName: \"kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.372704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.378711 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.379150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.379149 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.379241 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.379248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.379940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.386249 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnx8t\" (UniqueName: \"kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k2jql\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:15 crc kubenswrapper[4772]: I0930 17:59:15.481378 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 17:59:16 crc kubenswrapper[4772]: I0930 17:59:16.019790 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql"] Sep 30 17:59:16 crc kubenswrapper[4772]: I0930 17:59:16.052834 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" event={"ID":"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5","Type":"ContainerStarted","Data":"5c9573bf4d93d635fa49f17c35ada3b9a1def21f436576fc2c6688ab48304000"} Sep 30 17:59:17 crc kubenswrapper[4772]: I0930 17:59:17.066470 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" event={"ID":"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5","Type":"ContainerStarted","Data":"c80581a06b0ce2a648a77ba5c5227766985e3bee424e72cb0b7733f761667221"} Sep 30 17:59:17 crc kubenswrapper[4772]: I0930 17:59:17.088939 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" podStartSLOduration=1.668653544 podStartE2EDuration="2.08891202s" podCreationTimestamp="2025-09-30 17:59:15 +0000 UTC" firstStartedPulling="2025-09-30 17:59:16.025580345 +0000 UTC m=+3456.932593176" lastFinishedPulling="2025-09-30 17:59:16.445838821 +0000 UTC m=+3457.352851652" observedRunningTime="2025-09-30 17:59:17.0855006 +0000 UTC m=+3457.992513431" watchObservedRunningTime="2025-09-30 17:59:17.08891202 +0000 UTC m=+3457.995924851" Sep 30 17:59:38 crc kubenswrapper[4772]: I0930 17:59:38.655589 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 30 17:59:38 crc kubenswrapper[4772]: I0930 17:59:38.656225 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.168235 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft"] Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.171604 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.174381 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.174413 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.182178 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft"] Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.207984 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.208150 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.208596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbnh\" (UniqueName: \"kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.310802 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.310956 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.311099 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbnh\" (UniqueName: \"kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.312133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.319670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.332802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbnh\" (UniqueName: \"kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh\") pod \"collect-profiles-29320920-b4fft\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.503968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:00 crc kubenswrapper[4772]: I0930 18:00:00.994954 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft"] Sep 30 18:00:01 crc kubenswrapper[4772]: I0930 18:00:01.479592 4772 generic.go:334] "Generic (PLEG): container finished" podID="c10a89ba-f623-46aa-88c6-a37a9bbf0052" containerID="e63e98edcdaeccbb469b4122dc158f45068de02fc54c6595b5a2f81b52b34543" exitCode=0 Sep 30 18:00:01 crc kubenswrapper[4772]: I0930 18:00:01.479660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" event={"ID":"c10a89ba-f623-46aa-88c6-a37a9bbf0052","Type":"ContainerDied","Data":"e63e98edcdaeccbb469b4122dc158f45068de02fc54c6595b5a2f81b52b34543"} Sep 30 18:00:01 crc kubenswrapper[4772]: I0930 18:00:01.479923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" event={"ID":"c10a89ba-f623-46aa-88c6-a37a9bbf0052","Type":"ContainerStarted","Data":"fa62946e4431e682f3b51775c04f7c638bc6287e1e73e1b3ee541d794aa1b252"} Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.852595 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.867893 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume\") pod \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.867984 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lbnh\" (UniqueName: \"kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh\") pod \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.868412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume\") pod \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\" (UID: \"c10a89ba-f623-46aa-88c6-a37a9bbf0052\") " Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.870260 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume" (OuterVolumeSpecName: "config-volume") pod "c10a89ba-f623-46aa-88c6-a37a9bbf0052" (UID: "c10a89ba-f623-46aa-88c6-a37a9bbf0052"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.877516 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh" (OuterVolumeSpecName: "kube-api-access-4lbnh") pod "c10a89ba-f623-46aa-88c6-a37a9bbf0052" (UID: "c10a89ba-f623-46aa-88c6-a37a9bbf0052"). InnerVolumeSpecName "kube-api-access-4lbnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.877877 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c10a89ba-f623-46aa-88c6-a37a9bbf0052" (UID: "c10a89ba-f623-46aa-88c6-a37a9bbf0052"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.971887 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c10a89ba-f623-46aa-88c6-a37a9bbf0052-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.971946 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lbnh\" (UniqueName: \"kubernetes.io/projected/c10a89ba-f623-46aa-88c6-a37a9bbf0052-kube-api-access-4lbnh\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:02 crc kubenswrapper[4772]: I0930 18:00:02.971960 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c10a89ba-f623-46aa-88c6-a37a9bbf0052-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:03 crc kubenswrapper[4772]: I0930 18:00:03.504425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" event={"ID":"c10a89ba-f623-46aa-88c6-a37a9bbf0052","Type":"ContainerDied","Data":"fa62946e4431e682f3b51775c04f7c638bc6287e1e73e1b3ee541d794aa1b252"} Sep 30 18:00:03 crc kubenswrapper[4772]: I0930 18:00:03.504832 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa62946e4431e682f3b51775c04f7c638bc6287e1e73e1b3ee541d794aa1b252" Sep 30 18:00:03 crc kubenswrapper[4772]: I0930 18:00:03.504520 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft" Sep 30 18:00:03 crc kubenswrapper[4772]: I0930 18:00:03.926841 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5"] Sep 30 18:00:03 crc kubenswrapper[4772]: I0930 18:00:03.934997 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-np9f5"] Sep 30 18:00:05 crc kubenswrapper[4772]: I0930 18:00:05.911218 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f91f03b-d4a4-4907-9b91-1f8098230413" path="/var/lib/kubelet/pods/3f91f03b-d4a4-4907-9b91-1f8098230413/volumes" Sep 30 18:00:08 crc kubenswrapper[4772]: I0930 18:00:08.656172 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:00:08 crc kubenswrapper[4772]: I0930 18:00:08.657081 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:00:08 crc kubenswrapper[4772]: I0930 18:00:08.657165 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:00:08 crc kubenswrapper[4772]: I0930 18:00:08.658599 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a"} 
pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:00:08 crc kubenswrapper[4772]: I0930 18:00:08.658687 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a" gracePeriod=600 Sep 30 18:00:09 crc kubenswrapper[4772]: I0930 18:00:09.569681 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a" exitCode=0 Sep 30 18:00:09 crc kubenswrapper[4772]: I0930 18:00:09.569769 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a"} Sep 30 18:00:09 crc kubenswrapper[4772]: I0930 18:00:09.570134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb"} Sep 30 18:00:09 crc kubenswrapper[4772]: I0930 18:00:09.570175 4772 scope.go:117] "RemoveContainer" containerID="5d8356be428c6c588660f319d57b20ab18519b2d89eb778ec1f64f18d3c1f7fe" Sep 30 18:00:45 crc kubenswrapper[4772]: I0930 18:00:45.329322 4772 scope.go:117] "RemoveContainer" containerID="a905f9a2e34e13db6089bc782da3476e7115e86507f9acc4c8f663ddb440e72e" Sep 30 18:00:53 crc kubenswrapper[4772]: I0930 18:00:53.936302 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:00:53 crc kubenswrapper[4772]: E0930 18:00:53.937328 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c10a89ba-f623-46aa-88c6-a37a9bbf0052" containerName="collect-profiles" Sep 30 18:00:53 crc kubenswrapper[4772]: I0930 18:00:53.937347 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c10a89ba-f623-46aa-88c6-a37a9bbf0052" containerName="collect-profiles" Sep 30 18:00:53 crc kubenswrapper[4772]: I0930 18:00:53.937609 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c10a89ba-f623-46aa-88c6-a37a9bbf0052" containerName="collect-profiles" Sep 30 18:00:53 crc kubenswrapper[4772]: I0930 18:00:53.939968 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:53 crc kubenswrapper[4772]: I0930 18:00:53.952533 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.034384 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6jq\" (UniqueName: \"kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.035244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.035661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.138710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6jq\" (UniqueName: \"kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.139020 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.139380 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.139635 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.139758 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.161893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9k6jq\" (UniqueName: \"kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq\") pod \"redhat-operators-dk26v\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.261212 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:00:54 crc kubenswrapper[4772]: I0930 18:00:54.809867 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:00:55 crc kubenswrapper[4772]: I0930 18:00:55.000141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerStarted","Data":"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f"} Sep 30 18:00:55 crc kubenswrapper[4772]: I0930 18:00:55.000473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerStarted","Data":"d8c01b92b5a341f05b3511770c416b08ac2e68ae9740d553c1761fb7f03bcfa3"} Sep 30 18:00:56 crc kubenswrapper[4772]: I0930 18:00:56.009443 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerID="627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f" exitCode=0 Sep 30 18:00:56 crc kubenswrapper[4772]: I0930 18:00:56.009497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerDied","Data":"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f"} Sep 30 18:00:58 crc kubenswrapper[4772]: I0930 18:00:58.026752 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerID="4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510" exitCode=0 Sep 30 18:00:58 crc kubenswrapper[4772]: I0930 18:00:58.026949 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerDied","Data":"4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510"} Sep 30 18:00:59 crc kubenswrapper[4772]: I0930 18:00:59.037234 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerStarted","Data":"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2"} Sep 30 18:00:59 crc kubenswrapper[4772]: I0930 18:00:59.054447 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dk26v" podStartSLOduration=3.623525709 podStartE2EDuration="6.054424549s" podCreationTimestamp="2025-09-30 18:00:53 +0000 UTC" firstStartedPulling="2025-09-30 18:00:56.012141536 +0000 UTC m=+3556.919154367" lastFinishedPulling="2025-09-30 18:00:58.443040356 +0000 UTC m=+3559.350053207" observedRunningTime="2025-09-30 18:00:59.052349644 +0000 UTC m=+3559.959362475" watchObservedRunningTime="2025-09-30 18:00:59.054424549 +0000 UTC m=+3559.961437380" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.151871 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320921-q2m64"] Sep 30 18:01:00 crc 
kubenswrapper[4772]: I0930 18:01:00.153903 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.164284 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320921-q2m64"] Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.271272 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.271326 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4gl\" (UniqueName: \"kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.271362 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.271435 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.372971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.373244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4gl\" (UniqueName: \"kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.373343 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.373436 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc 
kubenswrapper[4772]: I0930 18:01:00.383516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.391894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.392116 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.402200 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4gl\" (UniqueName: \"kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl\") pod \"keystone-cron-29320921-q2m64\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.471693 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:00 crc kubenswrapper[4772]: I0930 18:01:00.977083 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320921-q2m64"] Sep 30 18:01:01 crc kubenswrapper[4772]: I0930 18:01:01.055175 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-q2m64" event={"ID":"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7","Type":"ContainerStarted","Data":"253a11033c8eae66f044025c57876c2cea6fb20cb81fd5099e123bf458d25930"} Sep 30 18:01:02 crc kubenswrapper[4772]: I0930 18:01:02.064933 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-q2m64" event={"ID":"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7","Type":"ContainerStarted","Data":"181982e4aeb02871f4dc8b4fe448088a4150e1d3671a78180a096b190b7eaaea"} Sep 30 18:01:02 crc kubenswrapper[4772]: I0930 18:01:02.086766 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320921-q2m64" podStartSLOduration=2.0867455 podStartE2EDuration="2.0867455s" podCreationTimestamp="2025-09-30 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:02.083636978 +0000 UTC m=+3562.990649809" watchObservedRunningTime="2025-09-30 18:01:02.0867455 +0000 UTC m=+3562.993758341" Sep 30 18:01:04 crc kubenswrapper[4772]: I0930 18:01:04.088724 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" containerID="181982e4aeb02871f4dc8b4fe448088a4150e1d3671a78180a096b190b7eaaea" exitCode=0 Sep 30 18:01:04 crc kubenswrapper[4772]: I0930 18:01:04.089286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-q2m64" 
event={"ID":"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7","Type":"ContainerDied","Data":"181982e4aeb02871f4dc8b4fe448088a4150e1d3671a78180a096b190b7eaaea"} Sep 30 18:01:04 crc kubenswrapper[4772]: I0930 18:01:04.261541 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:04 crc kubenswrapper[4772]: I0930 18:01:04.261590 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:04 crc kubenswrapper[4772]: I0930 18:01:04.304509 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.142567 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.198496 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.407147 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.477325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys\") pod \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.477471 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4gl\" (UniqueName: \"kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl\") pod \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.477516 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle\") pod \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.477594 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data\") pod \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\" (UID: \"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7\") " Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.483014 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" (UID: "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.485297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl" (OuterVolumeSpecName: "kube-api-access-rw4gl") pod "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" (UID: "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7"). InnerVolumeSpecName "kube-api-access-rw4gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.512337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" (UID: "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.528232 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data" (OuterVolumeSpecName: "config-data") pod "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" (UID: "3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.580578 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.580638 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4gl\" (UniqueName: \"kubernetes.io/projected/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-kube-api-access-rw4gl\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.580661 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4772]: I0930 18:01:05.580676 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:06 crc kubenswrapper[4772]: I0930 18:01:06.153968 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-q2m64" Sep 30 18:01:06 crc kubenswrapper[4772]: I0930 18:01:06.154684 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-q2m64" event={"ID":"3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7","Type":"ContainerDied","Data":"253a11033c8eae66f044025c57876c2cea6fb20cb81fd5099e123bf458d25930"} Sep 30 18:01:06 crc kubenswrapper[4772]: I0930 18:01:06.155076 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253a11033c8eae66f044025c57876c2cea6fb20cb81fd5099e123bf458d25930" Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.160230 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dk26v" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="registry-server" containerID="cri-o://81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2" gracePeriod=2 Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.629122 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.718951 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content\") pod \"d7391e8f-ca60-476e-87b5-85c262a8be29\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.719245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k6jq\" (UniqueName: \"kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq\") pod \"d7391e8f-ca60-476e-87b5-85c262a8be29\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.719352 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities\") pod \"d7391e8f-ca60-476e-87b5-85c262a8be29\" (UID: \"d7391e8f-ca60-476e-87b5-85c262a8be29\") " Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.720301 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities" (OuterVolumeSpecName: "utilities") pod "d7391e8f-ca60-476e-87b5-85c262a8be29" (UID: "d7391e8f-ca60-476e-87b5-85c262a8be29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.725777 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq" (OuterVolumeSpecName: "kube-api-access-9k6jq") pod "d7391e8f-ca60-476e-87b5-85c262a8be29" (UID: "d7391e8f-ca60-476e-87b5-85c262a8be29"). InnerVolumeSpecName "kube-api-access-9k6jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.822178 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k6jq\" (UniqueName: \"kubernetes.io/projected/d7391e8f-ca60-476e-87b5-85c262a8be29-kube-api-access-9k6jq\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:07 crc kubenswrapper[4772]: I0930 18:01:07.822223 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.175190 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerID="81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2" exitCode=0 Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.175260 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerDied","Data":"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2"} Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.175303 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk26v" event={"ID":"d7391e8f-ca60-476e-87b5-85c262a8be29","Type":"ContainerDied","Data":"d8c01b92b5a341f05b3511770c416b08ac2e68ae9740d553c1761fb7f03bcfa3"} Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.175329 4772 scope.go:117] "RemoveContainer" containerID="81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.175532 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk26v" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.217848 4772 scope.go:117] "RemoveContainer" containerID="4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.250724 4772 scope.go:117] "RemoveContainer" containerID="627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.299610 4772 scope.go:117] "RemoveContainer" containerID="81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2" Sep 30 18:01:08 crc kubenswrapper[4772]: E0930 18:01:08.300203 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2\": container with ID starting with 81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2 not found: ID does not exist" containerID="81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.300235 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2"} err="failed to get container status \"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2\": rpc error: code = NotFound desc = could not find container \"81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2\": container with ID starting with 81a821b887ab4d0a7ca77855ad3a61cf1262101b292cefa384f26b0062013fe2 not found: ID does not exist" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.300257 4772 scope.go:117] "RemoveContainer" containerID="4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510" Sep 30 18:01:08 crc kubenswrapper[4772]: E0930 18:01:08.300754 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510\": container with ID starting with 4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510 not found: ID does not exist" containerID="4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.300853 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510"} err="failed to get container status \"4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510\": rpc error: code = NotFound desc = could not find container \"4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510\": container with ID starting with 4515664a64144c9e6196b25688469106b32a5a19f4ce562d34ef12e8215fd510 not found: ID does not exist" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.300898 4772 scope.go:117] "RemoveContainer" containerID="627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f" Sep 30 18:01:08 crc kubenswrapper[4772]: E0930 18:01:08.301419 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f\": container with ID starting with 627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f not found: ID does not exist" containerID="627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f" 
Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.301454 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f"} err="failed to get container status \"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f\": rpc error: code = NotFound desc = could not find container \"627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f\": container with ID starting with 627a9c4a175f81cfb757d8bfbeebe14506095c046437472dca6e4c06af1d4b5f not found: ID does not exist" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.415882 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7391e8f-ca60-476e-87b5-85c262a8be29" (UID: "d7391e8f-ca60-476e-87b5-85c262a8be29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.443462 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7391e8f-ca60-476e-87b5-85c262a8be29-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.504612 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:01:08 crc kubenswrapper[4772]: I0930 18:01:08.513130 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dk26v"] Sep 30 18:01:09 crc kubenswrapper[4772]: I0930 18:01:09.920994 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" path="/var/lib/kubelet/pods/d7391e8f-ca60-476e-87b5-85c262a8be29/volumes" Sep 30 18:01:49 crc kubenswrapper[4772]: I0930 18:01:49.625839 4772 generic.go:334] "Generic (PLEG): container finished" podID="96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" containerID="c80581a06b0ce2a648a77ba5c5227766985e3bee424e72cb0b7733f761667221" exitCode=0 Sep 30 18:01:49 crc kubenswrapper[4772]: I0930 18:01:49.625939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" event={"ID":"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5","Type":"ContainerDied","Data":"c80581a06b0ce2a648a77ba5c5227766985e3bee424e72cb0b7733f761667221"} Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.119926 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.247950 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnx8t\" (UniqueName: \"kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.249598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.249779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.250204 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.250349 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.250469 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.250610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.250768 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle\") pod \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\" (UID: \"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5\") " Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.258256 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t" (OuterVolumeSpecName: "kube-api-access-rnx8t") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "kube-api-access-rnx8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.259634 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.262467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph" (OuterVolumeSpecName: "ceph") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.285337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.286149 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory" (OuterVolumeSpecName: "inventory") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.286987 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.293763 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.297554 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" (UID: "96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353714 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353752 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353765 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353775 4772 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353786 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnx8t\" (UniqueName: \"kubernetes.io/projected/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-kube-api-access-rnx8t\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353794 4772 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353803 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.353811 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.667307 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" event={"ID":"96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5","Type":"ContainerDied","Data":"5c9573bf4d93d635fa49f17c35ada3b9a1def21f436576fc2c6688ab48304000"} Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.667387 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c9573bf4d93d635fa49f17c35ada3b9a1def21f436576fc2c6688ab48304000" Sep 30 18:01:51 crc kubenswrapper[4772]: I0930 18:01:51.667352 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k2jql" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.503849 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: E0930 18:02:11.505155 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="extract-utilities" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505175 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="extract-utilities" Sep 30 18:02:11 crc kubenswrapper[4772]: E0930 18:02:11.505221 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" containerName="keystone-cron" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505230 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" containerName="keystone-cron" Sep 30 18:02:11 crc kubenswrapper[4772]: E0930 18:02:11.505257 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505266 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:02:11 crc kubenswrapper[4772]: E0930 18:02:11.505288 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="registry-server" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505298 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="registry-server" Sep 30 18:02:11 crc kubenswrapper[4772]: E0930 18:02:11.505321 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="extract-content" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505331 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="extract-content" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505574 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505590 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7" containerName="keystone-cron" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.505605 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7391e8f-ca60-476e-87b5-85c262a8be29" containerName="registry-server" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.507030 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.511665 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.511893 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.513104 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.515191 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.518366 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.524875 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.537652 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601324 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-sys\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-lib-modules\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601420 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601449 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601472 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601519 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601546 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-run\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601623 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-scripts\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601735 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601758 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601829 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpqv\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-kube-api-access-sxpqv\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601923 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.601978 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" 
(UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqkx\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-kube-api-access-scqkx\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602074 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602133 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-dev\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602163 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602190 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602220 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602261 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602296 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602394 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602409 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-run\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602424 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-ceph\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.602439 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.703992 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data\") pod \"cinder-backup-0\" (UID: 
\"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704173 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704202 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-run\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704248 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704269 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-ceph\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704293 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704342 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-lib-modules\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704364 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-sys\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704395 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704423 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704447 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704477 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-run\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704630 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-scripts\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704680 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704705 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704732 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sxpqv\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-kube-api-access-sxpqv\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704758 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704822 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqkx\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-kube-api-access-scqkx\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704850 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704873 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704904 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-dev\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704924 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704965 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " 
pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.704996 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705504 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705867 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705758 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705748 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705672 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-lib-modules\") pod 
\"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-dev\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-sys\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.705996 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.706019 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-run\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.706047 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-run\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.706122 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.706314 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b6b0b394-e87c-4287-ab65-5652e2cc09e1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.706473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.715976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.724799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-ceph\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.725586 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.727943 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.729788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.730417 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.731697 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.731948 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-config-data\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.732990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqkx\" (UniqueName: \"kubernetes.io/projected/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-kube-api-access-scqkx\") pod 
\"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.734235 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058-scripts\") pod \"cinder-backup-0\" (UID: \"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058\") " pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.743523 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpqv\" (UniqueName: \"kubernetes.io/projected/b6b0b394-e87c-4287-ab65-5652e2cc09e1-kube-api-access-sxpqv\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.744236 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b0b394-e87c-4287-ab65-5652e2cc09e1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b6b0b394-e87c-4287-ab65-5652e2cc09e1\") " pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.837178 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.843384 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume2-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.845817 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.856041 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.858579 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume2-config-data" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.868672 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume2-0"] Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909503 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-dev\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909571 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data-custom\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-sys\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909696 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-combined-ca-bundle\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909726 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-machine-id\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-scripts\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909793 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-run\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 
18:02:11.909809 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m929\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-kube-api-access-6m929\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-brick\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909861 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-lib-modules\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-nvme\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909929 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-ceph\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909958 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-iscsi\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909979 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:11 crc kubenswrapper[4772]: I0930 18:02:11.909998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-lib-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-lib-modules\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011793 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-nvme\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011823 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-ceph\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-iscsi\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011887 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011912 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-lib-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.011969 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-dev\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data-custom\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012033 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012089 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-sys\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012131 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-combined-ca-bundle\") pod \"cinder-volume-volume2-0\" 
(UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012165 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-machine-id\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012198 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-scripts\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012227 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-run\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m929\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-kube-api-access-6m929\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012281 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-brick\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-brick\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-iscsi\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012509 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-locks-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012551 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-var-lib-cinder\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012581 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-dev\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012830 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-machine-id\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012916 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-sys\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-run\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.013360 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-lib-modules\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.013416 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/08a95766-93a6-47b7-bce4-c556f7064db0-etc-nvme\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.024393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-ceph\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.029396 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.030635 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-scripts\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.035903 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-combined-ca-bundle\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.044173 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m929\" (UniqueName: \"kubernetes.io/projected/08a95766-93a6-47b7-bce4-c556f7064db0-kube-api-access-6m929\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.058858 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a95766-93a6-47b7-bce4-c556f7064db0-config-data-custom\") pod \"cinder-volume-volume2-0\" (UID: \"08a95766-93a6-47b7-bce4-c556f7064db0\") " pod="openstack/cinder-volume-volume2-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.110082 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"]
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.111733 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdbf867b9-w8nzc"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.120661 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.120904 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-xgs5d"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.121207 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.121866 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.168013 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"]
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.204208 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.206673 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.220574 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.220899 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.221705 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.227007 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgs5q"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.329732 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8f594cf49-g777q"]
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.337594 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f594cf49-g777q"
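
The records above show the kubelet volume reconciler walking each volume of cinder-volume-volume2-0 through its mount lifecycle: operationExecutor.VerifyControllerAttachedVolume, then operationExecutor.MountVolume started (reconciler_common.go), then MountVolume.SetUp succeeded (operation_generator.go), before the SyncLoop ADD records hand the new horizon and glance pods to the same machinery. A minimal sketch of one way to recover that per-volume lifecycle from a journal like this, assuming one kubenswrapper record per line as journalctl emits them; REC, PHASES and volume_phases are illustrative names, not kubelet interfaces:

    import re
    from collections import defaultdict

    # One kubenswrapper record per line, e.g.:
    #   Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.012165 4772
    #   reconciler_common.go:218] "operationExecutor.MountVolume started for
    #   volume \"scripts\" (...)" pod="openstack/cinder-volume-volume2-0"
    REC = re.compile(
        r'(?P<lvl>[IWE])(?P<ts>\d{4} \d{2}:\d{2}:\d{2}\.\d{6})\s+\d+\s+'
        r'(?P<src>\S+)\]\s+"(?P<msg>(?:[^"\\]|\\.)*)"(?P<rest>.*)'
    )

    # The three phases each volume passes through in this log, in order.
    PHASES = (
        "operationExecutor.VerifyControllerAttachedVolume started",
        "operationExecutor.MountVolume started",
        "MountVolume.SetUp succeeded",
    )

    def volume_phases(lines):
        """Map (pod, volume) -> phases observed, in log order."""
        seen = defaultdict(list)
        for line in lines:
            m = REC.search(line)
            if not m:
                continue
            msg = m.group("msg")
            # Volume names appear escaped inside the klog message, pod names
            # appear unescaped in the trailing structured fields.
            vol = re.search(r'volume \\"([^"\\]+)\\"', msg)
            pod = re.search(r'pod="([^"]+)"', m.group("rest"))
            if vol and pod:
                for phase in PHASES:
                    if msg.startswith(phase):
                        seen[(pod.group(1), vol.group(1))].append(phase)
        return seen

A (pod, volume) entry that never reaches "MountVolume.SetUp succeeded" is the one worth chasing; every cinder-volume-volume2-0 volume above gets there.
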
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.338174 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume2-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.342158 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.342326 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.342420 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.345063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhh4h\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.345537 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.345695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.345907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.347156 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.348260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts\") pod
\"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.348421 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.348750 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.348834 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.348924 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjnx\" (UniqueName: \"kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.349043 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.364444 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.381966 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.389021 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.393457 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.394024 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.437265 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f594cf49-g777q"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454515 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454902 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454953 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.454998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455016 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455072 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vtjnx\" (UniqueName: \"kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455101 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455123 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455197 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72wl\" (UniqueName: \"kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455221 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455243 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhh4h\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455298 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455317 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455343 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455372 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.455837 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.456137 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.456804 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.457081 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.459659 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.464443 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.470127 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.470286 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.472645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.481602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.494454 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.510427 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjnx\" (UniqueName: \"kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx\") pod \"horizon-6fdbf867b9-w8nzc\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") " pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: E0930 18:02:12.511625 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-w8lzv logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="bb88db36-3be1-4ed9-902e-b60277f90513" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.517958 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhh4h\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.519071 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.519168 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.525335 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"1d437fea-02e8-48b7-8991-fb48453b6246\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558485 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b72wl\" (UniqueName: \"kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558618 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lzv\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558757 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558825 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts\") pod 
\"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558894 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558948 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.558975 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.559028 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.560255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.560737 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.562011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.566740 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.599847 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72wl\" (UniqueName: \"kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl\") pod \"horizon-8f594cf49-g777q\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") " pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.660782 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.660927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lzv\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.661322 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.661356 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.661920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.661963 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.662038 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.662150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.662320 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.662462 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.661521 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.662340 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.667845 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.672516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.676698 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.683123 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lzv\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.683683 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.684649 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.708082 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.718776 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.747410 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.785350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.788434 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.945307 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.945470 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058","Type":"ContainerStarted","Data":"b07f3e41a8f18a2760b0373262f665eb32eea18a6f44ab7684c8a1b5b025aa27"} Sep 30 18:02:12 crc kubenswrapper[4772]: I0930 18:02:12.961692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.066308 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072114 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lzv\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072174 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072210 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072315 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072347 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072383 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072425 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072601 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.072658 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data\") pod \"bb88db36-3be1-4ed9-902e-b60277f90513\" (UID: \"bb88db36-3be1-4ed9-902e-b60277f90513\") " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.076590 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs" (OuterVolumeSpecName: "logs") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.076886 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.083364 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph" (OuterVolumeSpecName: "ceph") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.084019 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.084132 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data" (OuterVolumeSpecName: "config-data") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.084147 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.084237 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts" (OuterVolumeSpecName: "scripts") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.086010 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv" (OuterVolumeSpecName: "kube-api-access-w8lzv") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "kube-api-access-w8lzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.087251 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb88db36-3be1-4ed9-902e-b60277f90513" (UID: "bb88db36-3be1-4ed9-902e-b60277f90513"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175422 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175760 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175774 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175785 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175798 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lzv\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-kube-api-access-w8lzv\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175811 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bb88db36-3be1-4ed9-902e-b60277f90513-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175860 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175876 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb88db36-3be1-4ed9-902e-b60277f90513-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.175886 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb88db36-3be1-4ed9-902e-b60277f90513-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.182950 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume2-0"] Sep 30 18:02:13 crc kubenswrapper[4772]: W0930 18:02:13.195406 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a95766_93a6_47b7_bce4_c556f7064db0.slice/crio-ccbcb65beb14debe8f538ecbc3c5b8eb7c29fc0433b54f107010b02aca58b51d WatchSource:0}: Error finding container ccbcb65beb14debe8f538ecbc3c5b8eb7c29fc0433b54f107010b02aca58b51d: Status 404 returned error can't find the container with id ccbcb65beb14debe8f538ecbc3c5b8eb7c29fc0433b54f107010b02aca58b51d Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.229855 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.279799 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" 
DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.536482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f594cf49-g777q"] Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.643140 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"] Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.793946 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.987666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058","Type":"ContainerStarted","Data":"7f9730b0294dca1afbe412fafc93b36ee691def450c664f4ba30dc1389e84a55"} Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.996410 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b6b0b394-e87c-4287-ab65-5652e2cc09e1","Type":"ContainerStarted","Data":"c6488cc2b7b5d1c1d9bf819652cb843dd99ca94a816939c5a21295d9ca6baba7"} Sep 30 18:02:13 crc kubenswrapper[4772]: I0930 18:02:13.999034 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerStarted","Data":"93376d776d84bab9d635ea658825f838cbf192cee0574e3065b1f536810f2585"} Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.002864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerStarted","Data":"7e5fc2b3a7aacc87836be2cc4669d1d5ddc7fd02fb379a52c9d28eaf3f69c631"} Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.004889 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume2-0" event={"ID":"08a95766-93a6-47b7-bce4-c556f7064db0","Type":"ContainerStarted","Data":"ccbcb65beb14debe8f538ecbc3c5b8eb7c29fc0433b54f107010b02aca58b51d"} Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.005896 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.005885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerStarted","Data":"d3767dbeb39f91c2efdee34b1e0a6b532809769a2b34857c24980f3e1db7425e"} Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.148324 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.206778 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.251254 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.261740 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.265366 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.265703 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.290807 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.348795 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.352549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.352714 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.352760 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.352825 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgz6h\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.353011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.353104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.353230 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.353622 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456311 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456395 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456429 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456465 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456485 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgz6h\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456564 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456586 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.456622 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.457112 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.457969 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.459708 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.463144 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.463712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.464246 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.465227 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.470738 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " 
pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.475307 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgz6h\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.514905 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:14 crc kubenswrapper[4772]: I0930 18:02:14.643857 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.036866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b6b0b394-e87c-4287-ab65-5652e2cc09e1","Type":"ContainerStarted","Data":"cea14b57bdf5cb62439d60025e0bf54af24865080a9d544495296bbdc79c5fb0"} Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.037629 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b6b0b394-e87c-4287-ab65-5652e2cc09e1","Type":"ContainerStarted","Data":"20a7163db86cbed4530681ac05afc3dd6c4c3759be6cae94595790c6c2dcb3b1"} Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.045026 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume2-0" event={"ID":"08a95766-93a6-47b7-bce4-c556f7064db0","Type":"ContainerStarted","Data":"63d008773f3ba34ec4f2179f76649e7cec9525696ee7dd2f8bf042bdd71025a8"} Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.045096 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume2-0" event={"ID":"08a95766-93a6-47b7-bce4-c556f7064db0","Type":"ContainerStarted","Data":"ca88c6e6c0de26a1dd22b79d0f2e236352575bfd99ba02b22c4215e22e08a1d5"} Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.051752 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058","Type":"ContainerStarted","Data":"7ad5580671d38859aab423871eccae1de2926c39b64266b117e9004dfdec6163"} Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.090164 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.487956023 podStartE2EDuration="4.090132005s" podCreationTimestamp="2025-09-30 18:02:11 +0000 UTC" firstStartedPulling="2025-09-30 18:02:13.12579279 +0000 UTC m=+3634.032805621" lastFinishedPulling="2025-09-30 18:02:13.727968772 +0000 UTC m=+3634.634981603" observedRunningTime="2025-09-30 18:02:15.063267448 +0000 UTC m=+3635.970280289" watchObservedRunningTime="2025-09-30 18:02:15.090132005 +0000 UTC m=+3635.997144846" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.111040 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.746715401 podStartE2EDuration="4.111006074s" podCreationTimestamp="2025-09-30 18:02:11 +0000 UTC" firstStartedPulling="2025-09-30 18:02:12.761487367 +0000 UTC m=+3633.668500198" lastFinishedPulling="2025-09-30 
18:02:13.12577804 +0000 UTC m=+3634.032790871" observedRunningTime="2025-09-30 18:02:15.094432728 +0000 UTC m=+3636.001445579" watchObservedRunningTime="2025-09-30 18:02:15.111006074 +0000 UTC m=+3636.018018905" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.128120 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume2-0" podStartSLOduration=3.6022380309999997 podStartE2EDuration="4.128084954s" podCreationTimestamp="2025-09-30 18:02:11 +0000 UTC" firstStartedPulling="2025-09-30 18:02:13.201235145 +0000 UTC m=+3634.108247976" lastFinishedPulling="2025-09-30 18:02:13.727082068 +0000 UTC m=+3634.634094899" observedRunningTime="2025-09-30 18:02:15.117540826 +0000 UTC m=+3636.024553667" watchObservedRunningTime="2025-09-30 18:02:15.128084954 +0000 UTC m=+3636.035097785" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.352153 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.434270 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.484407 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.501205 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.503359 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.517226 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.544048 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.574795 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.591133 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f594cf49-g777q"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.610796 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9mv\" (UniqueName: \"kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.610879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.618449 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 
18:02:15.618567 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.618670 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.618971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.619107 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.636143 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dd79c8f84-lx2fj"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.638114 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.646900 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dd79c8f84-lx2fj"] Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb98c606-aef7-46e5-8242-7ebd28d542ba-logs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722423 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9mv\" (UniqueName: \"kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722497 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722534 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-config-data\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722562 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722589 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-scripts\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722612 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722650 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght9q\" (UniqueName: 
\"kubernetes.io/projected/bb98c606-aef7-46e5-8242-7ebd28d542ba-kube-api-access-ght9q\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722673 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-tls-certs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722725 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722757 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-combined-ca-bundle\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722778 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.722792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-secret-key\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.726251 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.729279 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.729695 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.742545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs\") pod \"horizon-bc48b88d8-rt7kf\" (UID: 
\"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.762665 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.764579 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.777765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9mv\" (UniqueName: \"kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv\") pod \"horizon-bc48b88d8-rt7kf\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824498 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-combined-ca-bundle\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-secret-key\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824618 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb98c606-aef7-46e5-8242-7ebd28d542ba-logs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-config-data\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824714 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-scripts\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824752 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght9q\" (UniqueName: \"kubernetes.io/projected/bb98c606-aef7-46e5-8242-7ebd28d542ba-kube-api-access-ght9q\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.824769 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-tls-certs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.832642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-tls-certs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.835507 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-config-data\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.835658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb98c606-aef7-46e5-8242-7ebd28d542ba-scripts\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.835872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb98c606-aef7-46e5-8242-7ebd28d542ba-logs\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.840442 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-horizon-secret-key\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.840903 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb98c606-aef7-46e5-8242-7ebd28d542ba-combined-ca-bundle\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.858605 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght9q\" (UniqueName: \"kubernetes.io/projected/bb98c606-aef7-46e5-8242-7ebd28d542ba-kube-api-access-ght9q\") pod \"horizon-5dd79c8f84-lx2fj\" (UID: \"bb98c606-aef7-46e5-8242-7ebd28d542ba\") " pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.885196 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.923570 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb88db36-3be1-4ed9-902e-b60277f90513" path="/var/lib/kubelet/pods/bb88db36-3be1-4ed9-902e-b60277f90513/volumes" Sep 30 18:02:15 crc kubenswrapper[4772]: I0930 18:02:15.983654 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.119992 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerStarted","Data":"eb162c67cd885117e2600cd23eb15751eccfb173620a9ac21ac8c7660b7cb3a2"} Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.130855 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerStarted","Data":"939b0f7b0693afa2a4b7e9f025a1e0bdbd42645bfdd2d6f4342657a8c51b567b"} Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.627839 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.809538 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dd79c8f84-lx2fj"] Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.840422 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Sep 30 18:02:16 crc kubenswrapper[4772]: I0930 18:02:16.856535 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.170461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd79c8f84-lx2fj" event={"ID":"bb98c606-aef7-46e5-8242-7ebd28d542ba","Type":"ContainerStarted","Data":"fd755c48330f3cf37a7eae63b447c1865658eecaee9dcc2e51b00cd5ce1e026c"} Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.201924 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-log" containerID="cri-o://939b0f7b0693afa2a4b7e9f025a1e0bdbd42645bfdd2d6f4342657a8c51b567b" gracePeriod=30 Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.202080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerStarted","Data":"cb2496dd0bd95184c6178b51b8ae612c1176cb12ebeb76cda0c8d453010b27d4"} Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.202205 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-httpd" containerID="cri-o://cb2496dd0bd95184c6178b51b8ae612c1176cb12ebeb76cda0c8d453010b27d4" gracePeriod=30 Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.210439 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerStarted","Data":"efa25851249cc2c0c4a6632aa9028b1a111494bdfa3f21c5a07218b9fd663df3"} Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.214886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerStarted","Data":"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813"} Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.239897 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.239869058 
podStartE2EDuration="5.239869058s" podCreationTimestamp="2025-09-30 18:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:17.232462194 +0000 UTC m=+3638.139475025" watchObservedRunningTime="2025-09-30 18:02:17.239869058 +0000 UTC m=+3638.146881889" Sep 30 18:02:17 crc kubenswrapper[4772]: I0930 18:02:17.340217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume2-0" Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.231930 4772 generic.go:334] "Generic (PLEG): container finished" podID="1d437fea-02e8-48b7-8991-fb48453b6246" containerID="cb2496dd0bd95184c6178b51b8ae612c1176cb12ebeb76cda0c8d453010b27d4" exitCode=0 Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.232793 4772 generic.go:334] "Generic (PLEG): container finished" podID="1d437fea-02e8-48b7-8991-fb48453b6246" containerID="939b0f7b0693afa2a4b7e9f025a1e0bdbd42645bfdd2d6f4342657a8c51b567b" exitCode=143 Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.232024 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerDied","Data":"cb2496dd0bd95184c6178b51b8ae612c1176cb12ebeb76cda0c8d453010b27d4"} Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.233182 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerDied","Data":"939b0f7b0693afa2a4b7e9f025a1e0bdbd42645bfdd2d6f4342657a8c51b567b"} Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.240388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerStarted","Data":"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9"} Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.240743 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-httpd" containerID="cri-o://229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" gracePeriod=30 Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.240734 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-log" containerID="cri-o://8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" gracePeriod=30 Sep 30 18:02:18 crc kubenswrapper[4772]: I0930 18:02:18.280979 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.280962035 podStartE2EDuration="4.280962035s" podCreationTimestamp="2025-09-30 18:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:18.279473556 +0000 UTC m=+3639.186486387" watchObservedRunningTime="2025-09-30 18:02:18.280962035 +0000 UTC m=+3639.187974866" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.047259 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.061419 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.178841 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.178910 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.178972 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179202 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179381 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179457 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179513 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179542 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179581 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179598 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs" (OuterVolumeSpecName: "logs") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179664 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179707 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhh4h\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179744 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179861 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179875 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs" (OuterVolumeSpecName: "logs") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179936 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgz6h\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.179967 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c7c7586e-2703-4a34-8fc1-379ee90156a9\" (UID: \"c7c7586e-2703-4a34-8fc1-379ee90156a9\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.180004 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs\") pod \"1d437fea-02e8-48b7-8991-fb48453b6246\" (UID: \"1d437fea-02e8-48b7-8991-fb48453b6246\") " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.180831 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.180861 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.182928 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.190754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph" (OuterVolumeSpecName: "ceph") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.191199 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.192177 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts" (OuterVolumeSpecName: "scripts") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.195293 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.199675 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h" (OuterVolumeSpecName: "kube-api-access-hhh4h") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "kube-api-access-hhh4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.218224 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts" (OuterVolumeSpecName: "scripts") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.220836 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.225860 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h" (OuterVolumeSpecName: "kube-api-access-xgz6h") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "kube-api-access-xgz6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.227488 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph" (OuterVolumeSpecName: "ceph") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.235293 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.264367 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.276996 4772 generic.go:334] "Generic (PLEG): container finished" podID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerID="229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" exitCode=143 Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.277087 4772 generic.go:334] "Generic (PLEG): container finished" podID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerID="8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" exitCode=143 Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.277216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerDied","Data":"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9"} Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.277353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerDied","Data":"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813"} Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.277382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c7586e-2703-4a34-8fc1-379ee90156a9","Type":"ContainerDied","Data":"eb162c67cd885117e2600cd23eb15751eccfb173620a9ac21ac8c7660b7cb3a2"} Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.277263 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.278080 4772 scope.go:117] "RemoveContainer" containerID="229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286631 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286678 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286694 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c7586e-2703-4a34-8fc1-379ee90156a9-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286710 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286724 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d437fea-02e8-48b7-8991-fb48453b6246-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286737 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhh4h\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-kube-api-access-hhh4h\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286913 4772 reconciler_common.go:293] "Volume detached for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/1d437fea-02e8-48b7-8991-fb48453b6246-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286931 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286945 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgz6h\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-kube-api-access-xgz6h\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286971 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.286986 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.287005 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7c7586e-2703-4a34-8fc1-379ee90156a9-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.299688 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d437fea-02e8-48b7-8991-fb48453b6246","Type":"ContainerDied","Data":"93376d776d84bab9d635ea658825f838cbf192cee0574e3065b1f536810f2585"} Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.299771 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.333528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data" (OuterVolumeSpecName: "config-data") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.338261 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.351567 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.361746 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.362922 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d437fea-02e8-48b7-8991-fb48453b6246" (UID: "1d437fea-02e8-48b7-8991-fb48453b6246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.371361 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data" (OuterVolumeSpecName: "config-data") pod "c7c7586e-2703-4a34-8fc1-379ee90156a9" (UID: "c7c7586e-2703-4a34-8fc1-379ee90156a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388695 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388743 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388758 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388770 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c7586e-2703-4a34-8fc1-379ee90156a9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388783 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d437fea-02e8-48b7-8991-fb48453b6246-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.388793 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.432106 4772 scope.go:117] "RemoveContainer" containerID="8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.471739 4772 scope.go:117] "RemoveContainer" containerID="229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.472759 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9\": 
container with ID starting with 229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9 not found: ID does not exist" containerID="229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.472879 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9"} err="failed to get container status \"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9\": rpc error: code = NotFound desc = could not find container \"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9\": container with ID starting with 229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9 not found: ID does not exist" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.472966 4772 scope.go:117] "RemoveContainer" containerID="8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.477413 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813\": container with ID starting with 8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813 not found: ID does not exist" containerID="8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.477847 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813"} err="failed to get container status \"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813\": rpc error: code = NotFound desc = could not find container \"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813\": container with ID starting with 8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813 not found: ID does not exist" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.477885 4772 scope.go:117] "RemoveContainer" containerID="229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.478983 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9"} err="failed to get container status \"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9\": rpc error: code = NotFound desc = could not find container \"229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9\": container with ID starting with 229aed22e01560ed937304a1f06b53147b7b1ae54ac4f65b4eb227543b731bc9 not found: ID does not exist" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.479074 4772 scope.go:117] "RemoveContainer" containerID="8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.481715 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813"} err="failed to get container status \"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813\": rpc error: code = NotFound desc = could not find container \"8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813\": container with ID starting with 8cb9b2b9928cfde4dcd7f39e062bcb77abc55848c2439474615b8516d2e42813 not 
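The repeated RemoveContainer / "DeleteContainer returned error" pairs above are benign: the kubelet asks the CRI runtime for the container's status before deleting it, the runtime answers NotFound because the container is already gone, and the deletion is treated as complete. A minimal sketch of that idempotent-delete pattern (not kubelet's actual code; deleteContainer and containerStatus are illustrative names, assuming a gRPC-based CRI client):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// containerStatus stands in for the runtime's ContainerStatus RPC.
func deleteContainer(id string, containerStatus func(string) error) error {
	if err := containerStatus(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // container already gone: deletion is idempotent
		}
		return fmt.Errorf("failed to get container status %q: %w", id, err)
	}
	// ...would issue the runtime's RemoveContainer RPC here...
	return nil
}

func main() {
	gone := func(string) error { return status.Error(codes.NotFound, "ID does not exist") }
	fmt.Println(deleteContainer("229aed22e015", gone)) // prints <nil>: treated as success
}
```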
found: ID does not exist"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.481743 4772 scope.go:117] "RemoveContainer" containerID="cb2496dd0bd95184c6178b51b8ae612c1176cb12ebeb76cda0c8d453010b27d4"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.523906 4772 scope.go:117] "RemoveContainer" containerID="939b0f7b0693afa2a4b7e9f025a1e0bdbd42645bfdd2d6f4342657a8c51b567b"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.619797 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.633043 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.659694 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.660333 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660367 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.660402 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660413 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.660429 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660435 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: E0930 18:02:19.660455 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660461 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660707 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660732 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-log"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660749 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.660770 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" containerName="glance-httpd"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.662431 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.668653 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgs5q"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.668962 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.669127 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.669638 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.701419 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.722120 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.742389 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.791073 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.805288 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.811452 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.811846 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821349 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzg7j\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-kube-api-access-fzg7j\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821805 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-logs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821918 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.821992 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-ceph\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.822322 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.855270 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.916574 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d437fea-02e8-48b7-8991-fb48453b6246" path="/var/lib/kubelet/pods/1d437fea-02e8-48b7-8991-fb48453b6246/volumes"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.918627 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c7586e-2703-4a34-8fc1-379ee90156a9" path="/var/lib/kubelet/pods/c7c7586e-2703-4a34-8fc1-379ee90156a9/volumes"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936370 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zj6\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-kube-api-access-r8zj6\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936524 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936586 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936650 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936715 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzg7j\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-kube-api-access-fzg7j\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936766 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936843 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936869 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-logs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936908 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936931 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936951 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.936973 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.937009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-ceph\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.937033 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.937076 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.944300 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.944562 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0"
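Mounting happens in two phases here: "MountVolume.MountDevice succeeded" is the node-global step (the local PV surfaces once, at /mnt/openstack/pv05), and the "MountVolume.SetUp succeeded" lines that follow are the per-pod step. A sketch of that split under assumed names (the Mounter interface below is illustrative, not kubelet's real volume-plugin API):

```go
package main

import "fmt"

type Mounter interface {
	MountDevice(volume, devicePath string) error // once per volume on the node
	SetUp(volume, podUID string) error           // once per pod using the volume
}

type fakeMounter struct{}

func (fakeMounter) MountDevice(v, p string) error {
	fmt.Printf("MountVolume.MountDevice succeeded for %s at %s\n", v, p)
	return nil
}

func (fakeMounter) SetUp(v, pod string) error {
	fmt.Printf("MountVolume.SetUp succeeded for %s in pod %s\n", v, pod)
	return nil
}

func main() {
	var m Mounter = fakeMounter{}
	m.MountDevice("local-storage05-crc", "/mnt/openstack/pv05")
	m.SetUp("local-storage05-crc", "7019346d-46b6-4f97-b309-58376e8a2d2a")
}
```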
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7019346d-46b6-4f97-b309-58376e8a2d2a-logs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.948675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.959554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.967030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzg7j\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-kube-api-access-fzg7j\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.973967 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.976748 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7019346d-46b6-4f97-b309-58376e8a2d2a-ceph\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:19 crc kubenswrapper[4772]: I0930 18:02:19.989414 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7019346d-46b6-4f97-b309-58376e8a2d2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zj6\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-kube-api-access-r8zj6\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042775 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.042949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.043035 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.043145 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.043162 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.044654 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.047353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.047910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3ca2624-92a6-4bcf-bbb6-4780637bef02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.053118 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.055888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.061319 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.061715 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.069030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7019346d-46b6-4f97-b309-58376e8a2d2a\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.077909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zj6\" (UniqueName: \"kubernetes.io/projected/d3ca2624-92a6-4bcf-bbb6-4780637bef02-kube-api-access-r8zj6\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.078315 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca2624-92a6-4bcf-bbb6-4780637bef02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.111880 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.125559 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3ca2624-92a6-4bcf-bbb6-4780637bef02\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.138040 4772 util.go:30] "No sandbox for pod can be found. 
Sep 30 18:02:20 crc kubenswrapper[4772]: I0930 18:02:20.900670 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:02:21 crc kubenswrapper[4772]: I0930 18:02:21.369878 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7019346d-46b6-4f97-b309-58376e8a2d2a","Type":"ContainerStarted","Data":"c7d79d18cbf27ce06a65031bd2546b1b72d84f286bafa672699953cfdea287df"}
Sep 30 18:02:21 crc kubenswrapper[4772]: I0930 18:02:21.761437 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:02:22 crc kubenswrapper[4772]: I0930 18:02:22.045177 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Sep 30 18:02:22 crc kubenswrapper[4772]: I0930 18:02:22.105246 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Sep 30 18:02:22 crc kubenswrapper[4772]: I0930 18:02:22.394115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7019346d-46b6-4f97-b309-58376e8a2d2a","Type":"ContainerStarted","Data":"d863003079a00b4ad0d968383255823cd001e9241414114e4cedcd2978c3df45"}
Sep 30 18:02:22 crc kubenswrapper[4772]: I0930 18:02:22.516190 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume2-0"
Sep 30 18:02:27 crc kubenswrapper[4772]: I0930 18:02:27.460858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3ca2624-92a6-4bcf-bbb6-4780637bef02","Type":"ContainerStarted","Data":"9969f7c7a5af8c32f5c12163d86580e8fba032b2876df5a9885b2bc172e6408b"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.479587 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerStarted","Data":"f3d00f6fc42877e79b08e1b028e0e7c8b7a535003bdb740acdfeda4b4c4de131"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.480424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerStarted","Data":"f91d99c16d576db2dcb2eb78f60530bd8eb5726f9a2b7bd8122f71b6b9516ffa"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.482738 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7019346d-46b6-4f97-b309-58376e8a2d2a","Type":"ContainerStarted","Data":"4355702971c2119bd09da5799e779b454ebeab79b9692d30fb63fb295e585808"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.486121 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd79c8f84-lx2fj" event={"ID":"bb98c606-aef7-46e5-8242-7ebd28d542ba","Type":"ContainerStarted","Data":"75fa3e99eb537d74cf57947059c0d249ca8f55f2c24e50bc3fb1b09e50ec3e35"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.486148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd79c8f84-lx2fj" event={"ID":"bb98c606-aef7-46e5-8242-7ebd28d542ba","Type":"ContainerStarted","Data":"f4c44706187c91df04b2cae36279fd29ece18ff0a8ad2cfff02f5cbae0e35c87"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.491671 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerStarted","Data":"bc033ecc4eaa7aca84ef85194f5f85ba76d158a2042d4a295bb644e481f6f8d7"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.491706 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerStarted","Data":"98bc3da918ed97847dead891c58e965e117ba7a01ac2abb8d50f793dbb7f86d1"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.491732 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8f594cf49-g777q" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon-log" containerID="cri-o://98bc3da918ed97847dead891c58e965e117ba7a01ac2abb8d50f793dbb7f86d1" gracePeriod=30
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.491842 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8f594cf49-g777q" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon" containerID="cri-o://bc033ecc4eaa7aca84ef85194f5f85ba76d158a2042d4a295bb644e481f6f8d7" gracePeriod=30
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.495046 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3ca2624-92a6-4bcf-bbb6-4780637bef02","Type":"ContainerStarted","Data":"8e2650211a02935cdb27c0c633168273c973147da930578a2611927372226e34"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.506520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerStarted","Data":"712f5ef7970e2f50a009b5c50ae7b5bce69716b81b541d26638c3445e65879b6"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.506568 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerStarted","Data":"e64b1f4c426b50785b9ea179576d9f4e0dda44ca6ff2412510b6f17aadf3822b"}
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.506705 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fdbf867b9-w8nzc" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon-log" containerID="cri-o://e64b1f4c426b50785b9ea179576d9f4e0dda44ca6ff2412510b6f17aadf3822b" gracePeriod=30
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.507010 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fdbf867b9-w8nzc" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon" containerID="cri-o://712f5ef7970e2f50a009b5c50ae7b5bce69716b81b541d26638c3445e65879b6" gracePeriod=30
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.508753 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bc48b88d8-rt7kf" podStartSLOduration=2.806595106 podStartE2EDuration="13.508720123s" podCreationTimestamp="2025-09-30 18:02:15 +0000 UTC" firstStartedPulling="2025-09-30 18:02:16.659287975 +0000 UTC m=+3637.566300806" lastFinishedPulling="2025-09-30 18:02:27.361412992 +0000 UTC m=+3648.268425823" observedRunningTime="2025-09-30 18:02:28.497929519 +0000 UTC m=+3649.404942350" watchObservedRunningTime="2025-09-30 18:02:28.508720123 +0000 UTC m=+3649.415732954"
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.526919 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dd79c8f84-lx2fj" podStartSLOduration=3.003898607 podStartE2EDuration="13.526887641s" podCreationTimestamp="2025-09-30 18:02:15 +0000 UTC" firstStartedPulling="2025-09-30 18:02:16.875445802 +0000 UTC m=+3637.782458633" lastFinishedPulling="2025-09-30 18:02:27.398434836 +0000 UTC m=+3648.305447667" observedRunningTime="2025-09-30 18:02:28.519298441 +0000 UTC m=+3649.426311272" watchObservedRunningTime="2025-09-30 18:02:28.526887641 +0000 UTC m=+3649.433900472"
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.548538 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8f594cf49-g777q" podStartSLOduration=2.86639995 podStartE2EDuration="16.54851139s" podCreationTimestamp="2025-09-30 18:02:12 +0000 UTC" firstStartedPulling="2025-09-30 18:02:13.686263105 +0000 UTC m=+3634.593275936" lastFinishedPulling="2025-09-30 18:02:27.368374545 +0000 UTC m=+3648.275387376" observedRunningTime="2025-09-30 18:02:28.541451754 +0000 UTC m=+3649.448464585" watchObservedRunningTime="2025-09-30 18:02:28.54851139 +0000 UTC m=+3649.455524221"
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.574034 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.57401118 podStartE2EDuration="9.57401118s" podCreationTimestamp="2025-09-30 18:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:28.568000982 +0000 UTC m=+3649.475013813" watchObservedRunningTime="2025-09-30 18:02:28.57401118 +0000 UTC m=+3649.481024011"
Sep 30 18:02:28 crc kubenswrapper[4772]: I0930 18:02:28.598982 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fdbf867b9-w8nzc" podStartSLOduration=2.9766970710000002 podStartE2EDuration="16.598939116s" podCreationTimestamp="2025-09-30 18:02:12 +0000 UTC" firstStartedPulling="2025-09-30 18:02:13.727275624 +0000 UTC m=+3634.634288455" lastFinishedPulling="2025-09-30 18:02:27.349517669 +0000 UTC m=+3648.256530500" observedRunningTime="2025-09-30 18:02:28.589539829 +0000 UTC m=+3649.496552660" watchObservedRunningTime="2025-09-30 18:02:28.598939116 +0000 UTC m=+3649.505951947"
Sep 30 18:02:29 crc kubenswrapper[4772]: I0930 18:02:29.550286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3ca2624-92a6-4bcf-bbb6-4780637bef02","Type":"ContainerStarted","Data":"be0db803b5adc09a6dc407304b952efe4c33b1fca3df0446da82c50c12c576f8"}
Sep 30 18:02:29 crc kubenswrapper[4772]: I0930 18:02:29.588474 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.588450467 podStartE2EDuration="10.588450467s" podCreationTimestamp="2025-09-30 18:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:29.572374284 +0000 UTC m=+3650.479387115" watchObservedRunningTime="2025-09-30 18:02:29.588450467 +0000 UTC m=+3650.495463298"
Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.112773 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.112840 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
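The startup-latency records above can be checked by hand: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pulling window (lastFinishedPulling minus firstStartedPulling). A small sketch reproducing the arithmetic from the horizon-bc48b88d8-rt7kf values logged above:

```go
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-30 18:02:15 +0000 UTC")
	firstPull := mustParse("2025-09-30 18:02:16.659287975 +0000 UTC")
	lastPull := mustParse("2025-09-30 18:02:27.361412992 +0000 UTC")
	watchObserved := mustParse("2025-09-30 18:02:28.508720123 +0000 UTC")

	e2e := watchObserved.Sub(created)       // 13.508720123s, the E2E duration
	slo := e2e - lastPull.Sub(firstPull)    // 2.806595106s, pulling time excluded
	fmt.Println(e2e, slo)
}
```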
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.139114 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.139412 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.168651 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.170787 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.175756 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.179734 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.560190 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.560273 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.560287 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 18:02:30 crc kubenswrapper[4772]: I0930 18:02:30.560295 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:32 crc kubenswrapper[4772]: I0930 18:02:32.788216 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:32 crc kubenswrapper[4772]: I0930 18:02:32.788891 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:34 crc kubenswrapper[4772]: I0930 18:02:34.404837 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 18:02:34 crc kubenswrapper[4772]: I0930 18:02:34.424581 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 18:02:35 crc kubenswrapper[4772]: I0930 18:02:35.886792 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:35 crc kubenswrapper[4772]: I0930 18:02:35.887027 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:35 crc kubenswrapper[4772]: I0930 18:02:35.984571 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:35 crc kubenswrapper[4772]: I0930 18:02:35.984620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:38 crc kubenswrapper[4772]: I0930 18:02:38.660496 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:02:38 crc kubenswrapper[4772]: I0930 18:02:38.660857 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:02:48 crc kubenswrapper[4772]: I0930 18:02:48.155452 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:48 crc kubenswrapper[4772]: I0930 18:02:48.170881 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:50 crc kubenswrapper[4772]: I0930 18:02:50.042309 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5dd79c8f84-lx2fj" Sep 30 18:02:50 crc kubenswrapper[4772]: I0930 18:02:50.055873 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:02:50 crc kubenswrapper[4772]: I0930 18:02:50.141613 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:02:50 crc kubenswrapper[4772]: I0930 18:02:50.825388 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bc48b88d8-rt7kf" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon-log" containerID="cri-o://f91d99c16d576db2dcb2eb78f60530bd8eb5726f9a2b7bd8122f71b6b9516ffa" gracePeriod=30 Sep 30 18:02:50 crc kubenswrapper[4772]: I0930 18:02:50.825901 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bc48b88d8-rt7kf" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" containerID="cri-o://f3d00f6fc42877e79b08e1b028e0e7c8b7a535003bdb740acdfeda4b4c4de131" gracePeriod=30 Sep 30 18:02:51 crc kubenswrapper[4772]: I0930 18:02:51.836978 4772 generic.go:334] "Generic (PLEG): container finished" podID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerID="f3d00f6fc42877e79b08e1b028e0e7c8b7a535003bdb740acdfeda4b4c4de131" exitCode=0 Sep 30 18:02:51 crc kubenswrapper[4772]: I0930 18:02:51.837158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerDied","Data":"f3d00f6fc42877e79b08e1b028e0e7c8b7a535003bdb740acdfeda4b4c4de131"} Sep 30 18:02:52 crc kubenswrapper[4772]: I0930 18:02:52.276316 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:52 crc kubenswrapper[4772]: I0930 18:02:52.280130 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:55 crc kubenswrapper[4772]: I0930 18:02:55.886956 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc48b88d8-rt7kf" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused" Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.953411 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerID="bc033ecc4eaa7aca84ef85194f5f85ba76d158a2042d4a295bb644e481f6f8d7" exitCode=137 Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.954371 4772 generic.go:334] "Generic (PLEG): container finished" podID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerID="98bc3da918ed97847dead891c58e965e117ba7a01ac2abb8d50f793dbb7f86d1" exitCode=137 Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.953483 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerDied","Data":"bc033ecc4eaa7aca84ef85194f5f85ba76d158a2042d4a295bb644e481f6f8d7"} Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.954478 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerDied","Data":"98bc3da918ed97847dead891c58e965e117ba7a01ac2abb8d50f793dbb7f86d1"} Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.954496 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f594cf49-g777q" event={"ID":"f7d014f7-ce9c-4749-82bb-320ff97777a4","Type":"ContainerDied","Data":"7e5fc2b3a7aacc87836be2cc4669d1d5ddc7fd02fb379a52c9d28eaf3f69c631"} Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.954512 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5fc2b3a7aacc87836be2cc4669d1d5ddc7fd02fb379a52c9d28eaf3f69c631" Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.961080 4772 generic.go:334] "Generic (PLEG): container finished" podID="e64d71ac-e536-4893-9433-c4c0154635a7" containerID="712f5ef7970e2f50a009b5c50ae7b5bce69716b81b541d26638c3445e65879b6" exitCode=137 Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.961120 4772 generic.go:334] "Generic (PLEG): container finished" podID="e64d71ac-e536-4893-9433-c4c0154635a7" containerID="e64b1f4c426b50785b9ea179576d9f4e0dda44ca6ff2412510b6f17aadf3822b" exitCode=137 Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.961151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerDied","Data":"712f5ef7970e2f50a009b5c50ae7b5bce69716b81b541d26638c3445e65879b6"} Sep 30 18:02:58 crc kubenswrapper[4772]: I0930 18:02:58.961191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerDied","Data":"e64b1f4c426b50785b9ea179576d9f4e0dda44ca6ff2412510b6f17aadf3822b"} Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.032551 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.040221 4772 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.094671 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs\") pod \"f7d014f7-ce9c-4749-82bb-320ff97777a4\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.094785 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data\") pod \"e64d71ac-e536-4893-9433-c4c0154635a7\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.094876 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key\") pod \"e64d71ac-e536-4893-9433-c4c0154635a7\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.094909 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtjnx\" (UniqueName: \"kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx\") pod \"e64d71ac-e536-4893-9433-c4c0154635a7\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095274 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key\") pod \"f7d014f7-ce9c-4749-82bb-320ff97777a4\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b72wl\" (UniqueName: \"kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl\") pod \"f7d014f7-ce9c-4749-82bb-320ff97777a4\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095432 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data\") pod \"f7d014f7-ce9c-4749-82bb-320ff97777a4\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095543 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts\") pod \"f7d014f7-ce9c-4749-82bb-320ff97777a4\" (UID: \"f7d014f7-ce9c-4749-82bb-320ff97777a4\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095649 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs\") pod \"e64d71ac-e536-4893-9433-c4c0154635a7\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.095719 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts\") pod \"e64d71ac-e536-4893-9433-c4c0154635a7\" (UID: \"e64d71ac-e536-4893-9433-c4c0154635a7\") "
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.096088 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs" (OuterVolumeSpecName: "logs") pod "f7d014f7-ce9c-4749-82bb-320ff97777a4" (UID: "f7d014f7-ce9c-4749-82bb-320ff97777a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.096765 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs" (OuterVolumeSpecName: "logs") pod "e64d71ac-e536-4893-9433-c4c0154635a7" (UID: "e64d71ac-e536-4893-9433-c4c0154635a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.097132 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7d014f7-ce9c-4749-82bb-320ff97777a4-logs\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.097146 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e64d71ac-e536-4893-9433-c4c0154635a7-logs\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.104123 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx" (OuterVolumeSpecName: "kube-api-access-vtjnx") pod "e64d71ac-e536-4893-9433-c4c0154635a7" (UID: "e64d71ac-e536-4893-9433-c4c0154635a7"). InnerVolumeSpecName "kube-api-access-vtjnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.109341 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e64d71ac-e536-4893-9433-c4c0154635a7" (UID: "e64d71ac-e536-4893-9433-c4c0154635a7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.119477 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl" (OuterVolumeSpecName: "kube-api-access-b72wl") pod "f7d014f7-ce9c-4749-82bb-320ff97777a4" (UID: "f7d014f7-ce9c-4749-82bb-320ff97777a4"). InnerVolumeSpecName "kube-api-access-b72wl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.119668 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f7d014f7-ce9c-4749-82bb-320ff97777a4" (UID: "f7d014f7-ce9c-4749-82bb-320ff97777a4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.132222 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts" (OuterVolumeSpecName: "scripts") pod "f7d014f7-ce9c-4749-82bb-320ff97777a4" (UID: "f7d014f7-ce9c-4749-82bb-320ff97777a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.132315 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts" (OuterVolumeSpecName: "scripts") pod "e64d71ac-e536-4893-9433-c4c0154635a7" (UID: "e64d71ac-e536-4893-9433-c4c0154635a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.132358 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data" (OuterVolumeSpecName: "config-data") pod "f7d014f7-ce9c-4749-82bb-320ff97777a4" (UID: "f7d014f7-ce9c-4749-82bb-320ff97777a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.141964 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data" (OuterVolumeSpecName: "config-data") pod "e64d71ac-e536-4893-9433-c4c0154635a7" (UID: "e64d71ac-e536-4893-9433-c4c0154635a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200026 4772 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e64d71ac-e536-4893-9433-c4c0154635a7-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200107 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtjnx\" (UniqueName: \"kubernetes.io/projected/e64d71ac-e536-4893-9433-c4c0154635a7-kube-api-access-vtjnx\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200125 4772 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f7d014f7-ce9c-4749-82bb-320ff97777a4-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200135 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b72wl\" (UniqueName: \"kubernetes.io/projected/f7d014f7-ce9c-4749-82bb-320ff97777a4-kube-api-access-b72wl\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200151 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200165 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7d014f7-ce9c-4749-82bb-320ff97777a4-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200177 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.200187 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64d71ac-e536-4893-9433-c4c0154635a7-config-data\") on node \"crc\" DevicePath \"\""
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdbf867b9-w8nzc" Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.977130 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdbf867b9-w8nzc" event={"ID":"e64d71ac-e536-4893-9433-c4c0154635a7","Type":"ContainerDied","Data":"d3767dbeb39f91c2efdee34b1e0a6b532809769a2b34857c24980f3e1db7425e"} Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.977784 4772 scope.go:117] "RemoveContainer" containerID="712f5ef7970e2f50a009b5c50ae7b5bce69716b81b541d26638c3445e65879b6" Sep 30 18:02:59 crc kubenswrapper[4772]: I0930 18:02:59.982526 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f594cf49-g777q" Sep 30 18:03:00 crc kubenswrapper[4772]: I0930 18:03:00.020335 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"] Sep 30 18:03:00 crc kubenswrapper[4772]: I0930 18:03:00.035775 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fdbf867b9-w8nzc"] Sep 30 18:03:00 crc kubenswrapper[4772]: I0930 18:03:00.045660 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f594cf49-g777q"] Sep 30 18:03:00 crc kubenswrapper[4772]: I0930 18:03:00.054515 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8f594cf49-g777q"] Sep 30 18:03:00 crc kubenswrapper[4772]: I0930 18:03:00.182154 4772 scope.go:117] "RemoveContainer" containerID="e64b1f4c426b50785b9ea179576d9f4e0dda44ca6ff2412510b6f17aadf3822b" Sep 30 18:03:01 crc kubenswrapper[4772]: I0930 18:03:01.916786 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" path="/var/lib/kubelet/pods/e64d71ac-e536-4893-9433-c4c0154635a7/volumes" Sep 30 18:03:01 crc kubenswrapper[4772]: I0930 18:03:01.917923 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" path="/var/lib/kubelet/pods/f7d014f7-ce9c-4749-82bb-320ff97777a4/volumes" Sep 30 18:03:05 crc kubenswrapper[4772]: I0930 18:03:05.886545 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc48b88d8-rt7kf" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused" Sep 30 18:03:08 crc kubenswrapper[4772]: I0930 18:03:08.655279 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:03:08 crc kubenswrapper[4772]: I0930 18:03:08.656106 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:03:15 crc kubenswrapper[4772]: I0930 18:03:15.887216 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc48b88d8-rt7kf" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused" Sep 30 18:03:15 crc kubenswrapper[4772]: I0930 18:03:15.887818 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.218598 4772 generic.go:334] "Generic (PLEG): container finished" podID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerID="f91d99c16d576db2dcb2eb78f60530bd8eb5726f9a2b7bd8122f71b6b9516ffa" exitCode=137 Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.218703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerDied","Data":"f91d99c16d576db2dcb2eb78f60530bd8eb5726f9a2b7bd8122f71b6b9516ffa"} Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.218998 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc48b88d8-rt7kf" event={"ID":"8de8baac-0d72-460d-83d5-1a96b08ce0cb","Type":"ContainerDied","Data":"efa25851249cc2c0c4a6632aa9028b1a111494bdfa3f21c5a07218b9fd663df3"} Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.219013 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efa25851249cc2c0c4a6632aa9028b1a111494bdfa3f21c5a07218b9fd663df3" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.273148 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433692 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9mv\" (UniqueName: \"kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433753 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433869 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.433897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.434045 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts\") pod \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\" (UID: \"8de8baac-0d72-460d-83d5-1a96b08ce0cb\") " Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.435015 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs" (OuterVolumeSpecName: "logs") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.440269 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.440704 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv" (OuterVolumeSpecName: "kube-api-access-7d9mv") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "kube-api-access-7d9mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.462729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts" (OuterVolumeSpecName: "scripts") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.465504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data" (OuterVolumeSpecName: "config-data") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.469125 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.498080 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8de8baac-0d72-460d-83d5-1a96b08ce0cb" (UID: "8de8baac-0d72-460d-83d5-1a96b08ce0cb"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536341 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9mv\" (UniqueName: \"kubernetes.io/projected/8de8baac-0d72-460d-83d5-1a96b08ce0cb-kube-api-access-7d9mv\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536372 4772 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536382 4772 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536390 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8de8baac-0d72-460d-83d5-1a96b08ce0cb-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536446 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536455 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de8baac-0d72-460d-83d5-1a96b08ce0cb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:21 crc kubenswrapper[4772]: I0930 18:03:21.536464 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de8baac-0d72-460d-83d5-1a96b08ce0cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:22 crc kubenswrapper[4772]: I0930 18:03:22.235480 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc48b88d8-rt7kf" Sep 30 18:03:22 crc kubenswrapper[4772]: I0930 18:03:22.264470 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:03:22 crc kubenswrapper[4772]: I0930 18:03:22.274340 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bc48b88d8-rt7kf"] Sep 30 18:03:23 crc kubenswrapper[4772]: I0930 18:03:23.910724 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" path="/var/lib/kubelet/pods/8de8baac-0d72-460d-83d5-1a96b08ce0cb/volumes" Sep 30 18:03:28 crc kubenswrapper[4772]: I0930 18:03:28.783621 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:28 crc kubenswrapper[4772]: I0930 18:03:28.784846 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="prometheus" containerID="cri-o://fa8fcb56ba464764db6ffb6fc836f13e17f54d29518d27ea98e40538d0cc74b2" gracePeriod=600 Sep 30 18:03:28 crc kubenswrapper[4772]: I0930 18:03:28.784932 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="thanos-sidecar" containerID="cri-o://ee97b6f54d1fdcfcf030810826727b7473e4082a10284499b1dc90a1ea2af6a8" gracePeriod=600 Sep 30 18:03:28 crc kubenswrapper[4772]: I0930 18:03:28.784953 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="config-reloader" containerID="cri-o://16e58bdbd634a993e4df41c064b6189911fd179943f9d3088eef4d2150d5f963" gracePeriod=600 Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305868 4772 generic.go:334] "Generic (PLEG): container finished" podID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerID="ee97b6f54d1fdcfcf030810826727b7473e4082a10284499b1dc90a1ea2af6a8" exitCode=0 Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305898 4772 generic.go:334] "Generic (PLEG): container finished" podID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerID="16e58bdbd634a993e4df41c064b6189911fd179943f9d3088eef4d2150d5f963" exitCode=0 Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305905 4772 generic.go:334] "Generic (PLEG): container finished" podID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerID="fa8fcb56ba464764db6ffb6fc836f13e17f54d29518d27ea98e40538d0cc74b2" exitCode=0 Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerDied","Data":"ee97b6f54d1fdcfcf030810826727b7473e4082a10284499b1dc90a1ea2af6a8"} Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305952 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerDied","Data":"16e58bdbd634a993e4df41c064b6189911fd179943f9d3088eef4d2150d5f963"} Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.305964 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerDied","Data":"fa8fcb56ba464764db6ffb6fc836f13e17f54d29518d27ea98e40538d0cc74b2"} Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.774895 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.967182 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.967262 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.968135 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.968206 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.968265 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhpxf\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.968312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969090 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969360 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969403 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969537 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969616 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.969678 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file\") pod \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\" (UID: \"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b\") " Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.970975 4772 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.975779 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.976280 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config" (OuterVolumeSpecName: "config") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.976736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.977354 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.978929 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out" (OuterVolumeSpecName: "config-out") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.979226 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf" (OuterVolumeSpecName: "kube-api-access-zhpxf") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "kube-api-access-zhpxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.979279 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.995835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:29 crc kubenswrapper[4772]: I0930 18:03:29.998980 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081529 4772 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081602 4772 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081615 4772 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081676 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") on node \"crc\" " Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081693 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhpxf\" (UniqueName: \"kubernetes.io/projected/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-kube-api-access-zhpxf\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081707 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081719 4772 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081737 4772 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.081829 4772 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.088414 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config" (OuterVolumeSpecName: "web-config") pod "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" (UID: "8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.145501 4772 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.145731 4772 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55") on node "crc" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.185786 4772 reconciler_common.go:293] "Volume detached for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.185838 4772 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.320278 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b","Type":"ContainerDied","Data":"5ee4cfeda89ce98d4146d01afc038d51fa93ce363bc9c7872d4b719b05da6a61"} Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.320593 4772 scope.go:117] "RemoveContainer" containerID="ee97b6f54d1fdcfcf030810826727b7473e4082a10284499b1dc90a1ea2af6a8" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.320358 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.359226 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.368238 4772 scope.go:117] "RemoveContainer" containerID="16e58bdbd634a993e4df41c064b6189911fd179943f9d3088eef4d2150d5f963" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.369653 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394279 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394723 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="prometheus" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394743 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="prometheus" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394752 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394759 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394774 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="thanos-sidecar" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394781 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="thanos-sidecar" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394795 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" 
containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394801 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394813 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394819 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394836 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="config-reloader" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394842 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="config-reloader" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394853 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394858 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394883 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394891 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394902 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394908 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: E0930 18:03:30.394916 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="init-config-reloader" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.394922 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="init-config-reloader" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395148 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395163 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="prometheus" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395169 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="config-reloader" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395179 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395194 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="thanos-sidecar" Sep 
30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395202 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395218 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de8baac-0d72-460d-83d5-1a96b08ce0cb" containerName="horizon" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395230 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64d71ac-e536-4893-9433-c4c0154635a7" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.395236 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d014f7-ce9c-4749-82bb-320ff97777a4" containerName="horizon-log" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.396904 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.405049 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.405242 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-phjm5" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.405374 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.411268 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.412350 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.415382 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.415796 4772 scope.go:117] "RemoveContainer" containerID="fa8fcb56ba464764db6ffb6fc836f13e17f54d29518d27ea98e40538d0cc74b2" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.434905 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.456637 4772 scope.go:117] "RemoveContainer" containerID="2f911176d380f31af114254402121761cf351d24a2ee740826e7f2c17aaf9f09" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.492845 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.492926 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 
18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8881ab23-9d2d-4563-b838-7b4583805e4f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493107 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493147 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493207 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8881ab23-9d2d-4563-b838-7b4583805e4f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493232 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493274 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8mf\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-kube-api-access-5q8mf\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.493343 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596011 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8881ab23-9d2d-4563-b838-7b4583805e4f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596093 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596135 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596215 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8881ab23-9d2d-4563-b838-7b4583805e4f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596242 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596303 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596333 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8mf\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-kube-api-access-5q8mf\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596395 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.596496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.598512 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8881ab23-9d2d-4563-b838-7b4583805e4f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.601305 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.601361 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a4cd7d25308c8c5d6d110405c655d59b160fe777a0b1c5faa198b785c403f1cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.602257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.602318 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.603224 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.603809 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.604362 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8881ab23-9d2d-4563-b838-7b4583805e4f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.604928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.607620 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.609408 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8881ab23-9d2d-4563-b838-7b4583805e4f-config\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.622155 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8mf\" (UniqueName: \"kubernetes.io/projected/8881ab23-9d2d-4563-b838-7b4583805e4f-kube-api-access-5q8mf\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.650380 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65455cbe-b0a4-4f3f-8a65-992b3c43de55\") pod \"prometheus-metric-storage-0\" (UID: \"8881ab23-9d2d-4563-b838-7b4583805e4f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:30 crc kubenswrapper[4772]: I0930 18:03:30.728611 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:31 crc kubenswrapper[4772]: I0930 18:03:31.288275 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:03:31 crc kubenswrapper[4772]: I0930 18:03:31.336135 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerStarted","Data":"2c7493128de68193a6cfa090370d8c1899653575ad8771f2aaa42f146f8fda77"} Sep 30 18:03:31 crc kubenswrapper[4772]: I0930 18:03:31.945925 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" path="/var/lib/kubelet/pods/8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b/volumes" Sep 30 18:03:32 crc kubenswrapper[4772]: I0930 18:03:32.692381 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="8057ad6a-2ffa-4c13-acc8-0846c3ad0d3b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.133:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 18:03:35 crc kubenswrapper[4772]: I0930 18:03:35.382475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerStarted","Data":"2bd9c046beabe20794861aebdc35729f6468f603c4485cddcec9ef858c2bdb2a"} Sep 30 18:03:38 crc kubenswrapper[4772]: I0930 18:03:38.655335 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:03:38 crc kubenswrapper[4772]: I0930 18:03:38.656927 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:03:38 crc kubenswrapper[4772]: I0930 
18:03:38.657078 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:03:38 crc kubenswrapper[4772]: I0930 18:03:38.658039 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:03:38 crc kubenswrapper[4772]: I0930 18:03:38.658219 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" gracePeriod=600 Sep 30 18:03:38 crc kubenswrapper[4772]: E0930 18:03:38.788742 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:03:39 crc kubenswrapper[4772]: I0930 18:03:39.436818 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" exitCode=0 Sep 30 18:03:39 crc kubenswrapper[4772]: I0930 18:03:39.436896 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb"} Sep 30 18:03:39 crc kubenswrapper[4772]: I0930 18:03:39.437035 4772 scope.go:117] "RemoveContainer" containerID="3833c2385cfa1ee8eb7c08c4dcf01f6d652b485c7a29505227f3ab3c212e162a" Sep 30 18:03:39 crc kubenswrapper[4772]: I0930 18:03:39.437572 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:03:39 crc kubenswrapper[4772]: E0930 18:03:39.438047 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:03:43 crc kubenswrapper[4772]: I0930 18:03:43.497683 4772 generic.go:334] "Generic (PLEG): container finished" podID="8881ab23-9d2d-4563-b838-7b4583805e4f" containerID="2bd9c046beabe20794861aebdc35729f6468f603c4485cddcec9ef858c2bdb2a" exitCode=0 Sep 30 18:03:43 crc kubenswrapper[4772]: I0930 18:03:43.497772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerDied","Data":"2bd9c046beabe20794861aebdc35729f6468f603c4485cddcec9ef858c2bdb2a"} Sep 30 18:03:44 crc kubenswrapper[4772]: I0930 18:03:44.509973 
4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerStarted","Data":"1109f8b666550d888cbc8e61b5b99ec939b41bf63ce9d59f8c117e7ef355180d"} Sep 30 18:03:47 crc kubenswrapper[4772]: I0930 18:03:47.538326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerStarted","Data":"9e3abf6b1977fc8da04ea2f2f575785ef8332cb458d34a1fa6a9dbc277766ad9"} Sep 30 18:03:47 crc kubenswrapper[4772]: I0930 18:03:47.538800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8881ab23-9d2d-4563-b838-7b4583805e4f","Type":"ContainerStarted","Data":"ed051a53d9ed19fb8ca75497b095fc2c46c6688d877dbffbb606fd7d0db47805"} Sep 30 18:03:47 crc kubenswrapper[4772]: I0930 18:03:47.596700 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.596679022 podStartE2EDuration="17.596679022s" podCreationTimestamp="2025-09-30 18:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:47.587798249 +0000 UTC m=+3728.494811080" watchObservedRunningTime="2025-09-30 18:03:47.596679022 +0000 UTC m=+3728.503691853" Sep 30 18:03:50 crc kubenswrapper[4772]: I0930 18:03:50.730598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 18:03:50 crc kubenswrapper[4772]: I0930 18:03:50.898745 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:03:50 crc kubenswrapper[4772]: E0930 18:03:50.899017 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:04:00 crc kubenswrapper[4772]: I0930 18:04:00.729707 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 18:04:00 crc kubenswrapper[4772]: I0930 18:04:00.741671 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 18:04:01 crc kubenswrapper[4772]: I0930 18:04:01.693404 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 18:04:05 crc kubenswrapper[4772]: I0930 18:04:05.903818 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:04:05 crc kubenswrapper[4772]: E0930 18:04:05.904758 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:04:16 crc 
kubenswrapper[4772]: E0930 18:04:16.009606 4772 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.115:35648->38.102.83.115:35633: write tcp 38.102.83.115:35648->38.102.83.115:35633: write: broken pipe Sep 30 18:04:20 crc kubenswrapper[4772]: I0930 18:04:20.898562 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:04:20 crc kubenswrapper[4772]: E0930 18:04:20.899326 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:04:21 crc kubenswrapper[4772]: I0930 18:04:21.989658 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.304186 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.306542 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.310067 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-t7k22" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.310795 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.310925 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.310976 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.357286 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.460612 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.460791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.460850 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54cht\" (UniqueName: \"kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.460912 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.460941 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.461283 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.461417 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.461851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.461962 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.564863 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.564929 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.564964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565049 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565106 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54cht\" (UniqueName: \"kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565160 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565177 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565201 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.565228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.566115 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.566577 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.566704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.567239 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.567431 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.575797 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.580412 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.581025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.585165 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54cht\" (UniqueName: \"kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.596935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " pod="openstack/tempest-tests-tempest" Sep 30 18:04:30 crc kubenswrapper[4772]: I0930 18:04:30.646029 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:04:31 crc kubenswrapper[4772]: I0930 18:04:31.146114 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:04:31 crc kubenswrapper[4772]: I0930 18:04:31.154049 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:04:32 crc kubenswrapper[4772]: I0930 18:04:32.059289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10f9355c-b2c3-4893-86db-91551575a21e","Type":"ContainerStarted","Data":"bc036abe7e0db14bc785ff0de32416a3930e7ab5149c3c53649b01dec24ac112"} Sep 30 18:04:32 crc kubenswrapper[4772]: I0930 18:04:32.900418 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:04:32 crc kubenswrapper[4772]: E0930 18:04:32.900835 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:04:41 crc kubenswrapper[4772]: I0930 18:04:41.498303 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 18:04:43 crc kubenswrapper[4772]: I0930 18:04:43.219246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10f9355c-b2c3-4893-86db-91551575a21e","Type":"ContainerStarted","Data":"53f755913376d077923ed246a5ea241b2b43eb5d27ad6bc405aefa9115e3534e"} Sep 30 18:04:43 crc kubenswrapper[4772]: I0930 18:04:43.253565 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.913519188 podStartE2EDuration="14.2535368s" podCreationTimestamp="2025-09-30 18:04:29 +0000 UTC" firstStartedPulling="2025-09-30 18:04:31.153824426 +0000 UTC m=+3772.060837247" lastFinishedPulling="2025-09-30 18:04:41.493842028 +0000 UTC m=+3782.400854859" observedRunningTime="2025-09-30 18:04:43.244283347 +0000 UTC m=+3784.151296198" watchObservedRunningTime="2025-09-30 18:04:43.2535368 +0000 UTC m=+3784.160549631" Sep 30 18:04:46 crc kubenswrapper[4772]: I0930 18:04:46.899324 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:04:46 crc kubenswrapper[4772]: E0930 18:04:46.900644 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:04:57 crc kubenswrapper[4772]: I0930 18:04:57.898236 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:04:57 crc kubenswrapper[4772]: E0930 18:04:57.899157 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:05:11 crc kubenswrapper[4772]: I0930 18:05:11.898086 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:05:11 crc kubenswrapper[4772]: E0930 18:05:11.902341 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:05:26 crc kubenswrapper[4772]: I0930 18:05:26.898884 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:05:26 crc kubenswrapper[4772]: E0930 18:05:26.899842 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:05:37 crc kubenswrapper[4772]: I0930 18:05:37.899239 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:05:37 crc kubenswrapper[4772]: E0930 18:05:37.900426 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.090098 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.108894 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.146258 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.315260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.315366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.315722 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzfrq\" (UniqueName: \"kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.418973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.419109 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.419176 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzfrq\" (UniqueName: \"kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.419755 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.420019 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:44 crc kubenswrapper[4772]: I0930 18:05:44.829141 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nzfrq\" (UniqueName: \"kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq\") pod \"certified-operators-kx4v6\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:45 crc kubenswrapper[4772]: I0930 18:05:45.067556 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:45 crc kubenswrapper[4772]: I0930 18:05:45.631086 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:45 crc kubenswrapper[4772]: I0930 18:05:45.973351 4772 generic.go:334] "Generic (PLEG): container finished" podID="0967125c-5511-4324-834f-f4085ba03024" containerID="902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2" exitCode=0 Sep 30 18:05:45 crc kubenswrapper[4772]: I0930 18:05:45.973487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerDied","Data":"902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2"} Sep 30 18:05:45 crc kubenswrapper[4772]: I0930 18:05:45.974328 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerStarted","Data":"b6118357f827299232962aebd6a94a59701eb65e7c3ee6f6045182e372c0a4f8"} Sep 30 18:05:46 crc kubenswrapper[4772]: I0930 18:05:46.989071 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerStarted","Data":"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc"} Sep 30 18:05:48 crc kubenswrapper[4772]: I0930 18:05:48.003986 4772 generic.go:334] "Generic (PLEG): container finished" podID="0967125c-5511-4324-834f-f4085ba03024" containerID="2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc" exitCode=0 Sep 30 18:05:48 crc kubenswrapper[4772]: I0930 18:05:48.004095 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerDied","Data":"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc"} Sep 30 18:05:49 crc kubenswrapper[4772]: I0930 18:05:49.020204 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerStarted","Data":"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062"} Sep 30 18:05:49 crc kubenswrapper[4772]: I0930 18:05:49.053465 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kx4v6" podStartSLOduration=2.578590273 podStartE2EDuration="5.053439459s" podCreationTimestamp="2025-09-30 18:05:44 +0000 UTC" firstStartedPulling="2025-09-30 18:05:45.97557685 +0000 UTC m=+3846.882589671" lastFinishedPulling="2025-09-30 18:05:48.450426026 +0000 UTC m=+3849.357438857" observedRunningTime="2025-09-30 18:05:49.041457184 +0000 UTC m=+3849.948470035" watchObservedRunningTime="2025-09-30 18:05:49.053439459 +0000 UTC m=+3849.960452290" Sep 30 18:05:52 crc kubenswrapper[4772]: I0930 18:05:52.899090 4772 scope.go:117] "RemoveContainer" 
containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:05:52 crc kubenswrapper[4772]: E0930 18:05:52.899855 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:05:55 crc kubenswrapper[4772]: I0930 18:05:55.068699 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:55 crc kubenswrapper[4772]: I0930 18:05:55.069074 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:55 crc kubenswrapper[4772]: I0930 18:05:55.128222 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:55 crc kubenswrapper[4772]: I0930 18:05:55.190700 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:55 crc kubenswrapper[4772]: I0930 18:05:55.375427 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.107090 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kx4v6" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="registry-server" containerID="cri-o://689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062" gracePeriod=2 Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.690403 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.776286 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzfrq\" (UniqueName: \"kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq\") pod \"0967125c-5511-4324-834f-f4085ba03024\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.776678 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content\") pod \"0967125c-5511-4324-834f-f4085ba03024\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.776833 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities\") pod \"0967125c-5511-4324-834f-f4085ba03024\" (UID: \"0967125c-5511-4324-834f-f4085ba03024\") " Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.777714 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities" (OuterVolumeSpecName: "utilities") pod "0967125c-5511-4324-834f-f4085ba03024" (UID: "0967125c-5511-4324-834f-f4085ba03024"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.786519 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq" (OuterVolumeSpecName: "kube-api-access-nzfrq") pod "0967125c-5511-4324-834f-f4085ba03024" (UID: "0967125c-5511-4324-834f-f4085ba03024"). InnerVolumeSpecName "kube-api-access-nzfrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.837673 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0967125c-5511-4324-834f-f4085ba03024" (UID: "0967125c-5511-4324-834f-f4085ba03024"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.882341 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.882383 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzfrq\" (UniqueName: \"kubernetes.io/projected/0967125c-5511-4324-834f-f4085ba03024-kube-api-access-nzfrq\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:57 crc kubenswrapper[4772]: I0930 18:05:57.882394 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0967125c-5511-4324-834f-f4085ba03024-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4772]: E0930 18:05:58.047410 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0967125c_5511_4324_834f_f4085ba03024.slice\": RecentStats: unable to find data in memory cache]" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.121097 4772 generic.go:334] "Generic (PLEG): container finished" podID="0967125c-5511-4324-834f-f4085ba03024" containerID="689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062" exitCode=0 Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.121274 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerDied","Data":"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062"} Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.121363 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kx4v6" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.121445 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kx4v6" event={"ID":"0967125c-5511-4324-834f-f4085ba03024","Type":"ContainerDied","Data":"b6118357f827299232962aebd6a94a59701eb65e7c3ee6f6045182e372c0a4f8"} Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.121472 4772 scope.go:117] "RemoveContainer" containerID="689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.150992 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.162171 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kx4v6"] Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.172553 4772 scope.go:117] "RemoveContainer" containerID="2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.871854 4772 scope.go:117] "RemoveContainer" containerID="902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.897001 4772 scope.go:117] "RemoveContainer" containerID="689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062" Sep 30 18:05:58 crc kubenswrapper[4772]: E0930 18:05:58.897689 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062\": container with ID starting with 689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062 not found: ID does not exist" containerID="689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.897753 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062"} err="failed to get container status \"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062\": rpc error: code = NotFound desc = could not find container \"689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062\": container with ID starting with 689fc43dab49ef87cd2407fee35d92496964a75ecb8fe22f8e61095a29380062 not found: ID does not exist" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.897798 4772 scope.go:117] "RemoveContainer" containerID="2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc" Sep 30 18:05:58 crc kubenswrapper[4772]: E0930 18:05:58.898240 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc\": container with ID starting with 2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc not found: ID does not exist" containerID="2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.898279 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc"} err="failed to get container status \"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc\": rpc error: code = NotFound desc = could not find 
container \"2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc\": container with ID starting with 2c97dc5e8b69f838643cb2642f6b986eaee3539f09f6091c7dfed19d4b5959bc not found: ID does not exist" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.898308 4772 scope.go:117] "RemoveContainer" containerID="902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2" Sep 30 18:05:58 crc kubenswrapper[4772]: E0930 18:05:58.898627 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2\": container with ID starting with 902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2 not found: ID does not exist" containerID="902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2" Sep 30 18:05:58 crc kubenswrapper[4772]: I0930 18:05:58.898661 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2"} err="failed to get container status \"902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2\": rpc error: code = NotFound desc = could not find container \"902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2\": container with ID starting with 902c44b2112297a03a8f50965d2672543c410cbef3d233263ff222c7d3249ed2 not found: ID does not exist" Sep 30 18:05:59 crc kubenswrapper[4772]: I0930 18:05:59.911017 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0967125c-5511-4324-834f-f4085ba03024" path="/var/lib/kubelet/pods/0967125c-5511-4324-834f-f4085ba03024/volumes" Sep 30 18:06:05 crc kubenswrapper[4772]: I0930 18:06:05.898915 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:06:05 crc kubenswrapper[4772]: E0930 18:06:05.899933 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:06:17 crc kubenswrapper[4772]: I0930 18:06:17.899442 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:06:17 crc kubenswrapper[4772]: E0930 18:06:17.900779 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:06:32 crc kubenswrapper[4772]: I0930 18:06:32.898899 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:06:32 crc kubenswrapper[4772]: E0930 18:06:32.899900 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:06:43 crc kubenswrapper[4772]: I0930 18:06:43.900135 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:06:43 crc kubenswrapper[4772]: E0930 18:06:43.901324 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:06:55 crc kubenswrapper[4772]: I0930 18:06:55.899469 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:06:55 crc kubenswrapper[4772]: E0930 18:06:55.917578 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:07:09 crc kubenswrapper[4772]: I0930 18:07:09.906314 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:07:09 crc kubenswrapper[4772]: E0930 18:07:09.907103 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:07:24 crc kubenswrapper[4772]: I0930 18:07:24.900212 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:07:24 crc kubenswrapper[4772]: E0930 18:07:24.901633 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:07:36 crc kubenswrapper[4772]: I0930 18:07:36.898310 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:07:36 crc kubenswrapper[4772]: E0930 18:07:36.899698 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.214392 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:42 crc kubenswrapper[4772]: E0930 18:07:42.215873 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="extract-utilities" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.215890 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="extract-utilities" Sep 30 18:07:42 crc kubenswrapper[4772]: E0930 18:07:42.215902 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="extract-content" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.215909 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="extract-content" Sep 30 18:07:42 crc kubenswrapper[4772]: E0930 18:07:42.215959 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="registry-server" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.215966 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="registry-server" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.216224 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0967125c-5511-4324-834f-f4085ba03024" containerName="registry-server" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.218784 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.230202 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.321080 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.321294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwp5b\" (UniqueName: \"kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.321341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.425260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwp5b\" (UniqueName: \"kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b\") pod \"community-operators-kwhp5\" (UID: 
\"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.425355 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.425476 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.426098 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.426188 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.448800 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwp5b\" (UniqueName: \"kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b\") pod \"community-operators-kwhp5\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:42 crc kubenswrapper[4772]: I0930 18:07:42.547875 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:43 crc kubenswrapper[4772]: I0930 18:07:43.130897 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:43 crc kubenswrapper[4772]: I0930 18:07:43.358416 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerStarted","Data":"14ad9e2777cc96b7b6788a5d85ef1f3b72e11eb08061c955b941ad89e1fc7543"} Sep 30 18:07:44 crc kubenswrapper[4772]: I0930 18:07:44.373351 4772 generic.go:334] "Generic (PLEG): container finished" podID="11409e6c-f992-4088-a95c-095402ec741a" containerID="55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78" exitCode=0 Sep 30 18:07:44 crc kubenswrapper[4772]: I0930 18:07:44.373444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerDied","Data":"55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78"} Sep 30 18:07:45 crc kubenswrapper[4772]: I0930 18:07:45.387242 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerStarted","Data":"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6"} Sep 30 18:07:46 crc kubenswrapper[4772]: I0930 18:07:46.402730 4772 generic.go:334] "Generic (PLEG): container finished" podID="11409e6c-f992-4088-a95c-095402ec741a" containerID="61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6" exitCode=0 Sep 30 18:07:46 crc kubenswrapper[4772]: I0930 18:07:46.402943 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerDied","Data":"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6"} Sep 30 18:07:47 crc kubenswrapper[4772]: I0930 18:07:47.416538 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerStarted","Data":"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2"} Sep 30 18:07:47 crc kubenswrapper[4772]: I0930 18:07:47.442026 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kwhp5" podStartSLOduration=2.993199388 podStartE2EDuration="5.441993598s" podCreationTimestamp="2025-09-30 18:07:42 +0000 UTC" firstStartedPulling="2025-09-30 18:07:44.375844718 +0000 UTC m=+3965.282857549" lastFinishedPulling="2025-09-30 18:07:46.824638928 +0000 UTC m=+3967.731651759" observedRunningTime="2025-09-30 18:07:47.435713373 +0000 UTC m=+3968.342726204" watchObservedRunningTime="2025-09-30 18:07:47.441993598 +0000 UTC m=+3968.349006419" Sep 30 18:07:49 crc kubenswrapper[4772]: I0930 18:07:49.906236 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:07:49 crc kubenswrapper[4772]: E0930 18:07:49.906851 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:07:52 crc kubenswrapper[4772]: I0930 18:07:52.548093 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:52 crc kubenswrapper[4772]: I0930 18:07:52.548769 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:52 crc kubenswrapper[4772]: I0930 18:07:52.610188 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:53 crc kubenswrapper[4772]: I0930 18:07:53.523679 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:53 crc kubenswrapper[4772]: I0930 18:07:53.595866 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:55 crc kubenswrapper[4772]: I0930 18:07:55.495711 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kwhp5" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="registry-server" containerID="cri-o://8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2" gracePeriod=2 Sep 30 18:07:55 crc kubenswrapper[4772]: I0930 18:07:55.973610 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.166336 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content\") pod \"11409e6c-f992-4088-a95c-095402ec741a\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.166647 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwp5b\" (UniqueName: \"kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b\") pod \"11409e6c-f992-4088-a95c-095402ec741a\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.166676 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities\") pod \"11409e6c-f992-4088-a95c-095402ec741a\" (UID: \"11409e6c-f992-4088-a95c-095402ec741a\") " Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.168136 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities" (OuterVolumeSpecName: "utilities") pod "11409e6c-f992-4088-a95c-095402ec741a" (UID: "11409e6c-f992-4088-a95c-095402ec741a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.177415 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b" (OuterVolumeSpecName: "kube-api-access-vwp5b") pod "11409e6c-f992-4088-a95c-095402ec741a" (UID: "11409e6c-f992-4088-a95c-095402ec741a"). InnerVolumeSpecName "kube-api-access-vwp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.269016 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwp5b\" (UniqueName: \"kubernetes.io/projected/11409e6c-f992-4088-a95c-095402ec741a-kube-api-access-vwp5b\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.269052 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.507589 4772 generic.go:334] "Generic (PLEG): container finished" podID="11409e6c-f992-4088-a95c-095402ec741a" containerID="8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2" exitCode=0 Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.507648 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerDied","Data":"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2"} Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.507723 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwhp5" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.507732 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwhp5" event={"ID":"11409e6c-f992-4088-a95c-095402ec741a","Type":"ContainerDied","Data":"14ad9e2777cc96b7b6788a5d85ef1f3b72e11eb08061c955b941ad89e1fc7543"} Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.507782 4772 scope.go:117] "RemoveContainer" containerID="8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.541684 4772 scope.go:117] "RemoveContainer" containerID="61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.582752 4772 scope.go:117] "RemoveContainer" containerID="55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.650354 4772 scope.go:117] "RemoveContainer" containerID="8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2" Sep 30 18:07:56 crc kubenswrapper[4772]: E0930 18:07:56.650911 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2\": container with ID starting with 8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2 not found: ID does not exist" containerID="8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.650950 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2"} err="failed to get container status \"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2\": rpc error: code = NotFound desc = could not find container \"8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2\": container with ID starting with 8592209f2bda04661b9180301f6970767c34f8e62e88b7603966e7f9deb8acd2 not found: ID does not exist" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.650981 4772 scope.go:117] "RemoveContainer" containerID="61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6" Sep 30 18:07:56 crc kubenswrapper[4772]: E0930 18:07:56.651490 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6\": container with ID starting with 61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6 not found: ID does not exist" containerID="61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.651516 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6"} err="failed to get container status \"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6\": rpc error: code = NotFound desc = could not find container \"61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6\": container with ID starting with 61474172c4b5206abca70b43a3c076fefa928cf931f0c1db3b04bfcf1595c5c6 not found: ID does not exist" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.651536 4772 scope.go:117] "RemoveContainer" containerID="55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78" Sep 30 18:07:56 crc kubenswrapper[4772]: E0930 18:07:56.651974 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78\": container with ID starting with 55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78 not found: ID does not exist" containerID="55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.651998 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78"} err="failed to get container status \"55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78\": rpc error: code = NotFound desc = could not find container \"55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78\": container with ID starting with 55f7625dd84501975b3051f4105459b22351465a8fa877e3efe9a1c3e4428c78 not found: ID does not exist" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.699746 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11409e6c-f992-4088-a95c-095402ec741a" (UID: "11409e6c-f992-4088-a95c-095402ec741a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.781765 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11409e6c-f992-4088-a95c-095402ec741a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.849934 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:56 crc kubenswrapper[4772]: I0930 18:07:56.858879 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kwhp5"] Sep 30 18:07:57 crc kubenswrapper[4772]: I0930 18:07:57.909243 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11409e6c-f992-4088-a95c-095402ec741a" path="/var/lib/kubelet/pods/11409e6c-f992-4088-a95c-095402ec741a/volumes" Sep 30 18:08:03 crc kubenswrapper[4772]: I0930 18:08:03.898722 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:08:03 crc kubenswrapper[4772]: E0930 18:08:03.899525 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.167425 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:14 crc kubenswrapper[4772]: E0930 18:08:14.169184 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="registry-server" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.169203 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="registry-server" Sep 30 18:08:14 crc kubenswrapper[4772]: E0930 18:08:14.169235 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="extract-utilities" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.169244 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="extract-utilities" Sep 30 18:08:14 crc kubenswrapper[4772]: E0930 18:08:14.169269 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="extract-content" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.169277 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="extract-content" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.169569 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="11409e6c-f992-4088-a95c-095402ec741a" containerName="registry-server" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.172882 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.180766 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.317383 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.317536 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5csxt\" (UniqueName: \"kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.317576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.419997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.420551 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5csxt\" (UniqueName: \"kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.420619 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.420886 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.421099 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.440920 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5csxt\" (UniqueName: \"kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt\") pod \"redhat-marketplace-q6mbl\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:14 crc kubenswrapper[4772]: I0930 18:08:14.524800 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:15 crc kubenswrapper[4772]: I0930 18:08:15.055681 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:15 crc kubenswrapper[4772]: I0930 18:08:15.707111 4772 generic.go:334] "Generic (PLEG): container finished" podID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerID="f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091" exitCode=0 Sep 30 18:08:15 crc kubenswrapper[4772]: I0930 18:08:15.707169 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerDied","Data":"f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091"} Sep 30 18:08:15 crc kubenswrapper[4772]: I0930 18:08:15.707405 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerStarted","Data":"90ab06bcc7f7464caf10f34e05e083721b685f5bbc3291c9579895c709ba3aea"} Sep 30 18:08:16 crc kubenswrapper[4772]: I0930 18:08:16.726189 4772 generic.go:334] "Generic (PLEG): container finished" podID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerID="1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e" exitCode=0 Sep 30 18:08:16 crc kubenswrapper[4772]: I0930 18:08:16.726279 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerDied","Data":"1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e"} Sep 30 18:08:16 crc kubenswrapper[4772]: I0930 18:08:16.898658 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:08:16 crc kubenswrapper[4772]: E0930 18:08:16.898953 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:08:17 crc kubenswrapper[4772]: I0930 18:08:17.739850 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerStarted","Data":"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95"} Sep 30 18:08:17 crc kubenswrapper[4772]: I0930 18:08:17.774264 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q6mbl" podStartSLOduration=2.24956543 podStartE2EDuration="3.77423694s" podCreationTimestamp="2025-09-30 18:08:14 +0000 UTC" firstStartedPulling="2025-09-30 18:08:15.710443197 +0000 UTC m=+3996.617456028" lastFinishedPulling="2025-09-30 
18:08:17.235114717 +0000 UTC m=+3998.142127538" observedRunningTime="2025-09-30 18:08:17.763150058 +0000 UTC m=+3998.670162889" watchObservedRunningTime="2025-09-30 18:08:17.77423694 +0000 UTC m=+3998.681249781" Sep 30 18:08:24 crc kubenswrapper[4772]: I0930 18:08:24.524949 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:24 crc kubenswrapper[4772]: I0930 18:08:24.525706 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:24 crc kubenswrapper[4772]: I0930 18:08:24.580371 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:24 crc kubenswrapper[4772]: I0930 18:08:24.864856 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.148445 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.149547 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q6mbl" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="registry-server" containerID="cri-o://9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95" gracePeriod=2 Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.667589 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.762084 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5csxt\" (UniqueName: \"kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt\") pod \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.762189 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities\") pod \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.762443 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content\") pod \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\" (UID: \"c3c5daa2-f0a0-424c-9147-1d123897dc8d\") " Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.763197 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities" (OuterVolumeSpecName: "utilities") pod "c3c5daa2-f0a0-424c-9147-1d123897dc8d" (UID: "c3c5daa2-f0a0-424c-9147-1d123897dc8d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.786434 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt" (OuterVolumeSpecName: "kube-api-access-5csxt") pod "c3c5daa2-f0a0-424c-9147-1d123897dc8d" (UID: "c3c5daa2-f0a0-424c-9147-1d123897dc8d"). InnerVolumeSpecName "kube-api-access-5csxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.791278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3c5daa2-f0a0-424c-9147-1d123897dc8d" (UID: "c3c5daa2-f0a0-424c-9147-1d123897dc8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.872955 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.872998 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5csxt\" (UniqueName: \"kubernetes.io/projected/c3c5daa2-f0a0-424c-9147-1d123897dc8d-kube-api-access-5csxt\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.873014 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3c5daa2-f0a0-424c-9147-1d123897dc8d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.875424 4772 generic.go:334] "Generic (PLEG): container finished" podID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerID="9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95" exitCode=0 Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.875487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerDied","Data":"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95"} Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.875601 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6mbl" event={"ID":"c3c5daa2-f0a0-424c-9147-1d123897dc8d","Type":"ContainerDied","Data":"90ab06bcc7f7464caf10f34e05e083721b685f5bbc3291c9579895c709ba3aea"} Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.875633 4772 scope.go:117] "RemoveContainer" containerID="9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.875913 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6mbl" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.957109 4772 scope.go:117] "RemoveContainer" containerID="1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e" Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.958664 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.971643 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6mbl"] Sep 30 18:08:27 crc kubenswrapper[4772]: I0930 18:08:27.986990 4772 scope.go:117] "RemoveContainer" containerID="f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.042454 4772 scope.go:117] "RemoveContainer" containerID="9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95" Sep 30 18:08:28 crc kubenswrapper[4772]: E0930 18:08:28.043140 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95\": container with ID starting with 9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95 not found: ID does not exist" containerID="9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.043223 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95"} err="failed to get container status \"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95\": rpc error: code = NotFound desc = could not find container \"9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95\": container with ID starting with 9f06af667a57121d1466aa9cccb40970cde9cb60281607c50255e412f69a5d95 not found: ID does not exist" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.043277 4772 scope.go:117] "RemoveContainer" containerID="1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e" Sep 30 18:08:28 crc kubenswrapper[4772]: E0930 18:08:28.043966 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e\": container with ID starting with 1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e not found: ID does not exist" containerID="1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.044030 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e"} err="failed to get container status \"1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e\": rpc error: code = NotFound desc = could not find container \"1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e\": container with ID starting with 1866f25d7010da7637d8133384f4727749cebe1f18d9157f07529a8eeed2d41e not found: ID does not exist" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.044214 4772 scope.go:117] "RemoveContainer" containerID="f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091" Sep 30 18:08:28 crc kubenswrapper[4772]: E0930 18:08:28.044628 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091\": container with ID starting with f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091 not found: ID does not exist" containerID="f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091" Sep 30 18:08:28 crc kubenswrapper[4772]: I0930 18:08:28.044676 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091"} err="failed to get container status \"f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091\": rpc error: code = NotFound desc = could not find container \"f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091\": container with ID starting with f5e0ae1b9915949e53433152300a5cbc03afdc1c6b575afa720a924d4641a091 not found: ID does not exist" Sep 30 18:08:29 crc kubenswrapper[4772]: I0930 18:08:29.913911 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" path="/var/lib/kubelet/pods/c3c5daa2-f0a0-424c-9147-1d123897dc8d/volumes" Sep 30 18:08:31 crc kubenswrapper[4772]: I0930 18:08:31.898852 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:08:31 crc kubenswrapper[4772]: E0930 18:08:31.899778 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:08:42 crc kubenswrapper[4772]: I0930 18:08:42.898989 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:08:44 crc kubenswrapper[4772]: I0930 18:08:44.060202 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292"} Sep 30 18:08:45 crc kubenswrapper[4772]: I0930 18:08:45.790581 4772 scope.go:117] "RemoveContainer" containerID="f91d99c16d576db2dcb2eb78f60530bd8eb5726f9a2b7bd8122f71b6b9516ffa" Sep 30 18:08:45 crc kubenswrapper[4772]: I0930 18:08:45.820617 4772 scope.go:117] "RemoveContainer" containerID="bc033ecc4eaa7aca84ef85194f5f85ba76d158a2042d4a295bb644e481f6f8d7" Sep 30 18:08:45 crc kubenswrapper[4772]: I0930 18:08:45.995285 4772 scope.go:117] "RemoveContainer" containerID="f3d00f6fc42877e79b08e1b028e0e7c8b7a535003bdb740acdfeda4b4c4de131" Sep 30 18:08:46 crc kubenswrapper[4772]: I0930 18:08:46.169461 4772 scope.go:117] "RemoveContainer" containerID="98bc3da918ed97847dead891c58e965e117ba7a01ac2abb8d50f793dbb7f86d1" Sep 30 18:11:08 crc kubenswrapper[4772]: I0930 18:11:08.655551 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:11:08 crc kubenswrapper[4772]: I0930 18:11:08.656162 4772 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.504870 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:16 crc kubenswrapper[4772]: E0930 18:11:16.506213 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="extract-content" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.506234 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="extract-content" Sep 30 18:11:16 crc kubenswrapper[4772]: E0930 18:11:16.506253 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="registry-server" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.506261 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="registry-server" Sep 30 18:11:16 crc kubenswrapper[4772]: E0930 18:11:16.506295 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="extract-utilities" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.506304 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="extract-utilities" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.506583 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c5daa2-f0a0-424c-9147-1d123897dc8d" containerName="registry-server" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.508354 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.528284 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.561440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.561561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.561660 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcx4n\" (UniqueName: \"kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.664562 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.664657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.664733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcx4n\" (UniqueName: \"kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.665336 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.665333 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.691005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fcx4n\" (UniqueName: \"kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n\") pod \"redhat-operators-65k4d\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:16 crc kubenswrapper[4772]: I0930 18:11:16.857979 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:17 crc kubenswrapper[4772]: I0930 18:11:17.452956 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:17 crc kubenswrapper[4772]: I0930 18:11:17.709417 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerStarted","Data":"357db1797f9e96ce19667f3d2e2f519f0c470349a6552ebe75b84457a3bd2806"} Sep 30 18:11:17 crc kubenswrapper[4772]: I0930 18:11:17.709470 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerStarted","Data":"27ab8cf5cb23aa93f0182ec10c7c49a17a97776920ef6987dcefe1872986fb9b"} Sep 30 18:11:18 crc kubenswrapper[4772]: I0930 18:11:18.722295 4772 generic.go:334] "Generic (PLEG): container finished" podID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerID="357db1797f9e96ce19667f3d2e2f519f0c470349a6552ebe75b84457a3bd2806" exitCode=0 Sep 30 18:11:18 crc kubenswrapper[4772]: I0930 18:11:18.722379 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerDied","Data":"357db1797f9e96ce19667f3d2e2f519f0c470349a6552ebe75b84457a3bd2806"} Sep 30 18:11:18 crc kubenswrapper[4772]: I0930 18:11:18.725664 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:11:20 crc kubenswrapper[4772]: I0930 18:11:20.750665 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerStarted","Data":"46d621d2a41b1d720d7d675a4109e60888399ea26e03737424a18ccf18762c74"} Sep 30 18:11:21 crc kubenswrapper[4772]: I0930 18:11:21.763366 4772 generic.go:334] "Generic (PLEG): container finished" podID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerID="46d621d2a41b1d720d7d675a4109e60888399ea26e03737424a18ccf18762c74" exitCode=0 Sep 30 18:11:21 crc kubenswrapper[4772]: I0930 18:11:21.763419 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerDied","Data":"46d621d2a41b1d720d7d675a4109e60888399ea26e03737424a18ccf18762c74"} Sep 30 18:11:22 crc kubenswrapper[4772]: I0930 18:11:22.776503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerStarted","Data":"d64d38a47f7e8952d2c76f1dd0acabf07d095c6e2de6280faa0ff51ee60d8ff2"} Sep 30 18:11:22 crc kubenswrapper[4772]: I0930 18:11:22.797985 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-65k4d" podStartSLOduration=3.182405267 podStartE2EDuration="6.797965848s" podCreationTimestamp="2025-09-30 18:11:16 +0000 UTC" firstStartedPulling="2025-09-30 
18:11:18.725410396 +0000 UTC m=+4179.632423227" lastFinishedPulling="2025-09-30 18:11:22.340970987 +0000 UTC m=+4183.247983808" observedRunningTime="2025-09-30 18:11:22.79497684 +0000 UTC m=+4183.701989691" watchObservedRunningTime="2025-09-30 18:11:22.797965848 +0000 UTC m=+4183.704978679" Sep 30 18:11:26 crc kubenswrapper[4772]: I0930 18:11:26.858179 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:26 crc kubenswrapper[4772]: I0930 18:11:26.858850 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:28 crc kubenswrapper[4772]: I0930 18:11:28.082783 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-65k4d" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="registry-server" probeResult="failure" output=< Sep 30 18:11:28 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 18:11:28 crc kubenswrapper[4772]: > Sep 30 18:11:36 crc kubenswrapper[4772]: I0930 18:11:36.918969 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:36 crc kubenswrapper[4772]: I0930 18:11:36.981992 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:37 crc kubenswrapper[4772]: I0930 18:11:37.164092 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:38 crc kubenswrapper[4772]: I0930 18:11:38.655792 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:11:38 crc kubenswrapper[4772]: I0930 18:11:38.655863 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:11:38 crc kubenswrapper[4772]: I0930 18:11:38.940942 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-65k4d" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="registry-server" containerID="cri-o://d64d38a47f7e8952d2c76f1dd0acabf07d095c6e2de6280faa0ff51ee60d8ff2" gracePeriod=2 Sep 30 18:11:39 crc kubenswrapper[4772]: I0930 18:11:39.964349 4772 generic.go:334] "Generic (PLEG): container finished" podID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerID="d64d38a47f7e8952d2c76f1dd0acabf07d095c6e2de6280faa0ff51ee60d8ff2" exitCode=0 Sep 30 18:11:39 crc kubenswrapper[4772]: I0930 18:11:39.964831 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerDied","Data":"d64d38a47f7e8952d2c76f1dd0acabf07d095c6e2de6280faa0ff51ee60d8ff2"} Sep 30 18:11:39 crc kubenswrapper[4772]: I0930 18:11:39.964882 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65k4d" 
event={"ID":"e62d4842-d97f-4c1e-b08f-c0840d2f5306","Type":"ContainerDied","Data":"27ab8cf5cb23aa93f0182ec10c7c49a17a97776920ef6987dcefe1872986fb9b"} Sep 30 18:11:39 crc kubenswrapper[4772]: I0930 18:11:39.964905 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27ab8cf5cb23aa93f0182ec10c7c49a17a97776920ef6987dcefe1872986fb9b" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.040200 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.100494 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities\") pod \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.100821 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcx4n\" (UniqueName: \"kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n\") pod \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.100961 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content\") pod \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\" (UID: \"e62d4842-d97f-4c1e-b08f-c0840d2f5306\") " Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.105094 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities" (OuterVolumeSpecName: "utilities") pod "e62d4842-d97f-4c1e-b08f-c0840d2f5306" (UID: "e62d4842-d97f-4c1e-b08f-c0840d2f5306"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.116715 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n" (OuterVolumeSpecName: "kube-api-access-fcx4n") pod "e62d4842-d97f-4c1e-b08f-c0840d2f5306" (UID: "e62d4842-d97f-4c1e-b08f-c0840d2f5306"). InnerVolumeSpecName "kube-api-access-fcx4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.201018 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e62d4842-d97f-4c1e-b08f-c0840d2f5306" (UID: "e62d4842-d97f-4c1e-b08f-c0840d2f5306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.208551 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcx4n\" (UniqueName: \"kubernetes.io/projected/e62d4842-d97f-4c1e-b08f-c0840d2f5306-kube-api-access-fcx4n\") on node \"crc\" DevicePath \"\"" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.208833 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.208894 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d4842-d97f-4c1e-b08f-c0840d2f5306-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:11:40 crc kubenswrapper[4772]: I0930 18:11:40.977164 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65k4d" Sep 30 18:11:41 crc kubenswrapper[4772]: I0930 18:11:41.035540 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:41 crc kubenswrapper[4772]: I0930 18:11:41.045618 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-65k4d"] Sep 30 18:11:41 crc kubenswrapper[4772]: I0930 18:11:41.912481 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" path="/var/lib/kubelet/pods/e62d4842-d97f-4c1e-b08f-c0840d2f5306/volumes" Sep 30 18:12:08 crc kubenswrapper[4772]: I0930 18:12:08.655502 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:12:08 crc kubenswrapper[4772]: I0930 18:12:08.656451 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:12:08 crc kubenswrapper[4772]: I0930 18:12:08.656521 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:12:08 crc kubenswrapper[4772]: I0930 18:12:08.657767 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:12:08 crc kubenswrapper[4772]: I0930 18:12:08.657827 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292" gracePeriod=600 Sep 30 18:12:09 crc kubenswrapper[4772]: I0930 18:12:09.299070 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292" exitCode=0 Sep 30 18:12:09 crc kubenswrapper[4772]: I0930 18:12:09.299102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292"} Sep 30 18:12:09 crc kubenswrapper[4772]: I0930 18:12:09.299564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4"} Sep 30 18:12:09 crc kubenswrapper[4772]: I0930 18:12:09.299597 4772 scope.go:117] "RemoveContainer" containerID="85cc0901b712ace163eb185e6df8fdbbfa1307627ab6ac5741c438d9bc4393bb" Sep 30 18:14:27 crc kubenswrapper[4772]: I0930 18:14:27.112600 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-684cbd44c-xstzf" podUID="d878293c-0383-4575-95cb-1062bcb4634e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 18:14:38 crc kubenswrapper[4772]: I0930 18:14:38.655105 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:14:38 crc kubenswrapper[4772]: I0930 18:14:38.655679 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.154835 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl"] Sep 30 18:15:00 crc kubenswrapper[4772]: E0930 18:15:00.158128 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="extract-content" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.158147 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="extract-content" Sep 30 18:15:00 crc kubenswrapper[4772]: E0930 18:15:00.158160 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="registry-server" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.158166 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="registry-server" Sep 30 18:15:00 crc kubenswrapper[4772]: E0930 18:15:00.158239 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="extract-utilities" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.158246 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="extract-utilities" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.158527 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e62d4842-d97f-4c1e-b08f-c0840d2f5306" containerName="registry-server" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.159371 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.163298 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.172201 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.175843 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl"] Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.197657 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb7hp\" (UniqueName: \"kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.197879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.198234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.300917 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.301022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.301185 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb7hp\" (UniqueName: \"kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.302143 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.315575 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.318592 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb7hp\" (UniqueName: \"kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp\") pod \"collect-profiles-29320935-9c2gl\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:00 crc kubenswrapper[4772]: I0930 18:15:00.482205 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:01 crc kubenswrapper[4772]: I0930 18:15:01.587077 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl"] Sep 30 18:15:02 crc kubenswrapper[4772]: I0930 18:15:02.221114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" event={"ID":"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc","Type":"ContainerStarted","Data":"2f12b2be0082f006cc14ad0563b4f27ed5cf06bac043f549eab4587b1a6a6a0d"} Sep 30 18:15:02 crc kubenswrapper[4772]: I0930 18:15:02.222407 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" event={"ID":"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc","Type":"ContainerStarted","Data":"0d9b04c9abfa000894112231fa4cb0f069238694f1a6eecbbb827116831e3bc1"} Sep 30 18:15:02 crc kubenswrapper[4772]: I0930 18:15:02.249549 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" podStartSLOduration=2.249522129 podStartE2EDuration="2.249522129s" podCreationTimestamp="2025-09-30 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:15:02.239600439 +0000 UTC m=+4403.146613270" watchObservedRunningTime="2025-09-30 18:15:02.249522129 +0000 UTC m=+4403.156534970" Sep 30 18:15:03 crc kubenswrapper[4772]: I0930 18:15:03.232006 4772 generic.go:334] "Generic (PLEG): container finished" podID="a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" containerID="2f12b2be0082f006cc14ad0563b4f27ed5cf06bac043f549eab4587b1a6a6a0d" exitCode=0 Sep 30 18:15:03 crc kubenswrapper[4772]: I0930 18:15:03.232101 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" event={"ID":"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc","Type":"ContainerDied","Data":"2f12b2be0082f006cc14ad0563b4f27ed5cf06bac043f549eab4587b1a6a6a0d"} Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.703576 4772 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.816299 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb7hp\" (UniqueName: \"kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp\") pod \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.816589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume\") pod \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.816845 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume\") pod \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\" (UID: \"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc\") " Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.817456 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume" (OuterVolumeSpecName: "config-volume") pod "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" (UID: "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.818463 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.824373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp" (OuterVolumeSpecName: "kube-api-access-pb7hp") pod "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" (UID: "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc"). InnerVolumeSpecName "kube-api-access-pb7hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.827277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" (UID: "a5c8106c-b8cc-42be-b81f-99bc7d42d7cc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.921435 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:15:04 crc kubenswrapper[4772]: I0930 18:15:04.921485 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb7hp\" (UniqueName: \"kubernetes.io/projected/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc-kube-api-access-pb7hp\") on node \"crc\" DevicePath \"\"" Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.260594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" event={"ID":"a5c8106c-b8cc-42be-b81f-99bc7d42d7cc","Type":"ContainerDied","Data":"0d9b04c9abfa000894112231fa4cb0f069238694f1a6eecbbb827116831e3bc1"} Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.260938 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d9b04c9abfa000894112231fa4cb0f069238694f1a6eecbbb827116831e3bc1" Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.260691 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl" Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.783730 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"] Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.791931 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-vtqmq"] Sep 30 18:15:05 crc kubenswrapper[4772]: I0930 18:15:05.912190 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c870583b-ddcc-4939-94ae-f192d0ed0f2b" path="/var/lib/kubelet/pods/c870583b-ddcc-4939-94ae-f192d0ed0f2b/volumes" Sep 30 18:15:08 crc kubenswrapper[4772]: I0930 18:15:08.655553 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:15:08 crc kubenswrapper[4772]: I0930 18:15:08.657240 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:15:38 crc kubenswrapper[4772]: I0930 18:15:38.655621 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:15:38 crc kubenswrapper[4772]: I0930 18:15:38.656524 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:15:38 crc kubenswrapper[4772]: 
I0930 18:15:38.656587 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:15:38 crc kubenswrapper[4772]: I0930 18:15:38.657787 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:15:38 crc kubenswrapper[4772]: I0930 18:15:38.657867 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" gracePeriod=600 Sep 30 18:15:38 crc kubenswrapper[4772]: E0930 18:15:38.789096 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:15:39 crc kubenswrapper[4772]: I0930 18:15:39.612542 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" exitCode=0 Sep 30 18:15:39 crc kubenswrapper[4772]: I0930 18:15:39.612602 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4"} Sep 30 18:15:39 crc kubenswrapper[4772]: I0930 18:15:39.612655 4772 scope.go:117] "RemoveContainer" containerID="d9a21cc6b5e8b6e83810bceaee9cfc86f5d0d4e53f16662de544a33074321292" Sep 30 18:15:39 crc kubenswrapper[4772]: I0930 18:15:39.613627 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:15:39 crc kubenswrapper[4772]: E0930 18:15:39.614036 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:15:46 crc kubenswrapper[4772]: I0930 18:15:46.405212 4772 scope.go:117] "RemoveContainer" containerID="ae2113f964fc5355023f4ab0ac80219cbe6afa876d32608300b0a44cfaaf71b7" Sep 30 18:15:51 crc kubenswrapper[4772]: I0930 18:15:51.898450 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:15:51 crc kubenswrapper[4772]: E0930 18:15:51.899445 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.430814 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:03 crc kubenswrapper[4772]: E0930 18:16:03.433108 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" containerName="collect-profiles" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.433214 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" containerName="collect-profiles" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.433709 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" containerName="collect-profiles" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.435600 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.442676 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.488821 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.489020 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.489471 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8tqp\" (UniqueName: \"kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.592826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8tqp\" (UniqueName: \"kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.592927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.593169 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.593647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.593690 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.612914 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8tqp\" (UniqueName: \"kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp\") pod \"certified-operators-hfxj6\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:03 crc kubenswrapper[4772]: I0930 18:16:03.758519 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:04 crc kubenswrapper[4772]: I0930 18:16:04.317816 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:04 crc kubenswrapper[4772]: I0930 18:16:04.919246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerStarted","Data":"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76"} Sep 30 18:16:04 crc kubenswrapper[4772]: I0930 18:16:04.919521 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerStarted","Data":"662d21fb739833c782e7b554bc711cf275db8942a83527a62c09dd3759f08593"} Sep 30 18:16:05 crc kubenswrapper[4772]: I0930 18:16:05.898002 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:16:05 crc kubenswrapper[4772]: E0930 18:16:05.898582 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:16:05 crc kubenswrapper[4772]: I0930 18:16:05.948529 4772 generic.go:334] "Generic (PLEG): container finished" podID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerID="9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76" exitCode=0 Sep 30 18:16:05 crc kubenswrapper[4772]: I0930 18:16:05.948574 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerDied","Data":"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76"} Sep 30 18:16:06 crc kubenswrapper[4772]: I0930 18:16:06.967453 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerStarted","Data":"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db"} Sep 30 18:16:07 crc kubenswrapper[4772]: I0930 18:16:07.985049 4772 generic.go:334] "Generic (PLEG): container finished" podID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerID="681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db" exitCode=0 Sep 30 18:16:07 crc kubenswrapper[4772]: I0930 18:16:07.985180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerDied","Data":"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db"} Sep 30 18:16:08 crc kubenswrapper[4772]: I0930 18:16:08.996733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerStarted","Data":"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c"} Sep 30 18:16:09 crc kubenswrapper[4772]: I0930 18:16:09.020517 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hfxj6" podStartSLOduration=3.549913012 podStartE2EDuration="6.020499315s" podCreationTimestamp="2025-09-30 18:16:03 +0000 UTC" firstStartedPulling="2025-09-30 18:16:05.950685055 +0000 UTC m=+4466.857697886" lastFinishedPulling="2025-09-30 18:16:08.421271358 +0000 UTC m=+4469.328284189" observedRunningTime="2025-09-30 18:16:09.016047659 +0000 UTC m=+4469.923060490" watchObservedRunningTime="2025-09-30 18:16:09.020499315 +0000 UTC m=+4469.927512146" Sep 30 18:16:13 crc kubenswrapper[4772]: I0930 18:16:13.758995 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:13 crc kubenswrapper[4772]: I0930 18:16:13.759816 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:13 crc kubenswrapper[4772]: I0930 18:16:13.810722 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:14 crc kubenswrapper[4772]: I0930 18:16:14.106430 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:14 crc kubenswrapper[4772]: I0930 18:16:14.162454 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.072752 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hfxj6" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="registry-server" containerID="cri-o://b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c" gracePeriod=2 Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.697534 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.807279 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities\") pod \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.807395 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content\") pod \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.807535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8tqp\" (UniqueName: \"kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp\") pod \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\" (UID: \"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8\") " Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.814921 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities" (OuterVolumeSpecName: "utilities") pod "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" (UID: "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.817607 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp" (OuterVolumeSpecName: "kube-api-access-h8tqp") pod "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" (UID: "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8"). InnerVolumeSpecName "kube-api-access-h8tqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.866365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" (UID: "7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.911304 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.911347 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8tqp\" (UniqueName: \"kubernetes.io/projected/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-kube-api-access-h8tqp\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:16 crc kubenswrapper[4772]: I0930 18:16:16.911358 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.086722 4772 generic.go:334] "Generic (PLEG): container finished" podID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerID="b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c" exitCode=0 Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.086783 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerDied","Data":"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c"} Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.086802 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfxj6" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.086820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfxj6" event={"ID":"7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8","Type":"ContainerDied","Data":"662d21fb739833c782e7b554bc711cf275db8942a83527a62c09dd3759f08593"} Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.086843 4772 scope.go:117] "RemoveContainer" containerID="b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.111657 4772 scope.go:117] "RemoveContainer" containerID="681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.133243 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.149276 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hfxj6"] Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.657034 4772 scope.go:117] "RemoveContainer" containerID="9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.679374 4772 scope.go:117] "RemoveContainer" containerID="b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c" Sep 30 18:16:17 crc kubenswrapper[4772]: E0930 18:16:17.679899 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c\": container with ID starting with b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c not found: ID does not exist" containerID="b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.679950 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c"} err="failed to get container status \"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c\": rpc error: code = NotFound desc = could not find container \"b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c\": container with ID starting with b4c1782548c8a22e0ea3bf46af6245e0ff6d2c837a9641ba42e50d4400ff0b5c not found: ID does not exist" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.679989 4772 scope.go:117] "RemoveContainer" containerID="681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db" Sep 30 18:16:17 crc kubenswrapper[4772]: E0930 18:16:17.680992 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db\": container with ID starting with 681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db not found: ID does not exist" containerID="681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.681029 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db"} err="failed to get container status \"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db\": rpc error: code = NotFound desc = could not find container \"681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db\": container with ID starting with 681391adce21247bf98be3bd65b65b70f25ebb19501d3771aab2ca095eaa54db not found: ID does not exist" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.681074 4772 scope.go:117] "RemoveContainer" containerID="9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76" Sep 30 18:16:17 crc kubenswrapper[4772]: E0930 18:16:17.681504 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76\": container with ID starting with 9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76 not found: ID does not exist" containerID="9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.681526 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76"} err="failed to get container status \"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76\": rpc error: code = NotFound desc = could not find container \"9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76\": container with ID starting with 9dc69b458962fcb46989bf1cc5b557e600d01f04e3dfd516ef9829751549ac76 not found: ID does not exist" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.898711 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:16:17 crc kubenswrapper[4772]: E0930 18:16:17.899219 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:16:17 crc kubenswrapper[4772]: I0930 18:16:17.909956 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" path="/var/lib/kubelet/pods/7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8/volumes" Sep 30 18:16:32 crc kubenswrapper[4772]: I0930 18:16:32.898173 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:16:32 crc kubenswrapper[4772]: E0930 18:16:32.898968 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:16:44 crc kubenswrapper[4772]: I0930 18:16:44.900144 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:16:44 crc kubenswrapper[4772]: E0930 18:16:44.901114 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:16:57 crc kubenswrapper[4772]: I0930 18:16:57.899337 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:16:57 crc kubenswrapper[4772]: E0930 18:16:57.900295 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:17:10 crc kubenswrapper[4772]: I0930 18:17:10.901813 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:17:10 crc kubenswrapper[4772]: E0930 18:17:10.902848 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:17:22 crc kubenswrapper[4772]: I0930 18:17:22.898742 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:17:22 crc kubenswrapper[4772]: E0930 18:17:22.899594 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:17:36 crc kubenswrapper[4772]: I0930 18:17:36.900206 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:17:36 crc kubenswrapper[4772]: E0930 18:17:36.901499 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:17:46 crc kubenswrapper[4772]: I0930 18:17:46.504469 4772 scope.go:117] "RemoveContainer" containerID="d64d38a47f7e8952d2c76f1dd0acabf07d095c6e2de6280faa0ff51ee60d8ff2" Sep 30 18:17:46 crc kubenswrapper[4772]: I0930 18:17:46.527078 4772 scope.go:117] "RemoveContainer" containerID="357db1797f9e96ce19667f3d2e2f519f0c470349a6552ebe75b84457a3bd2806" Sep 30 18:17:46 crc kubenswrapper[4772]: I0930 18:17:46.555644 4772 scope.go:117] "RemoveContainer" containerID="46d621d2a41b1d720d7d675a4109e60888399ea26e03737424a18ccf18762c74" Sep 30 18:17:51 crc kubenswrapper[4772]: I0930 18:17:51.899714 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:17:51 crc kubenswrapper[4772]: E0930 18:17:51.900982 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:18:05 crc kubenswrapper[4772]: I0930 18:18:05.899935 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:18:05 crc kubenswrapper[4772]: E0930 18:18:05.901639 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:18:18 crc kubenswrapper[4772]: I0930 18:18:18.898578 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:18:18 crc kubenswrapper[4772]: E0930 18:18:18.899451 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:18:19 crc kubenswrapper[4772]: I0930 
18:18:19.956238 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:19 crc kubenswrapper[4772]: E0930 18:18:19.957972 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="extract-content" Sep 30 18:18:19 crc kubenswrapper[4772]: I0930 18:18:19.958000 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="extract-content" Sep 30 18:18:19 crc kubenswrapper[4772]: E0930 18:18:19.958019 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="extract-utilities" Sep 30 18:18:19 crc kubenswrapper[4772]: I0930 18:18:19.958026 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="extract-utilities" Sep 30 18:18:19 crc kubenswrapper[4772]: E0930 18:18:19.958079 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="registry-server" Sep 30 18:18:19 crc kubenswrapper[4772]: I0930 18:18:19.958086 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="registry-server" Sep 30 18:18:19 crc kubenswrapper[4772]: I0930 18:18:19.995762 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd369c5-f0e2-4f57-87e3-a9c7f28b55d8" containerName="registry-server" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.001167 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.001314 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.178082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtgr5\" (UniqueName: \"kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.178419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.178497 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.281120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtgr5\" (UniqueName: \"kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.281259 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.281352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.281987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.282679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities\") pod \"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.322460 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtgr5\" (UniqueName: \"kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5\") pod 
\"community-operators-l45k5\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:20 crc kubenswrapper[4772]: I0930 18:18:20.337598 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:21 crc kubenswrapper[4772]: I0930 18:18:21.045089 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:21 crc kubenswrapper[4772]: I0930 18:18:21.444655 4772 generic.go:334] "Generic (PLEG): container finished" podID="53d8cae8-3ee9-4778-993c-07903200861f" containerID="bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603" exitCode=0 Sep 30 18:18:21 crc kubenswrapper[4772]: I0930 18:18:21.444709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerDied","Data":"bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603"} Sep 30 18:18:21 crc kubenswrapper[4772]: I0930 18:18:21.444989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerStarted","Data":"132a9ffd48c51e7408d9ea8e2f3b8643fda650582bfc4325a256978f5df2f94b"} Sep 30 18:18:21 crc kubenswrapper[4772]: I0930 18:18:21.446924 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:18:23 crc kubenswrapper[4772]: I0930 18:18:23.466896 4772 generic.go:334] "Generic (PLEG): container finished" podID="53d8cae8-3ee9-4778-993c-07903200861f" containerID="6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900" exitCode=0 Sep 30 18:18:23 crc kubenswrapper[4772]: I0930 18:18:23.467124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerDied","Data":"6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900"} Sep 30 18:18:24 crc kubenswrapper[4772]: I0930 18:18:24.479564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerStarted","Data":"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1"} Sep 30 18:18:24 crc kubenswrapper[4772]: I0930 18:18:24.498006 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l45k5" podStartSLOduration=2.807301253 podStartE2EDuration="5.49798742s" podCreationTimestamp="2025-09-30 18:18:19 +0000 UTC" firstStartedPulling="2025-09-30 18:18:21.446668859 +0000 UTC m=+4602.353681690" lastFinishedPulling="2025-09-30 18:18:24.137355036 +0000 UTC m=+4605.044367857" observedRunningTime="2025-09-30 18:18:24.496429849 +0000 UTC m=+4605.403442710" watchObservedRunningTime="2025-09-30 18:18:24.49798742 +0000 UTC m=+4605.405000251" Sep 30 18:18:30 crc kubenswrapper[4772]: I0930 18:18:30.337691 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:30 crc kubenswrapper[4772]: I0930 18:18:30.338231 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:30 crc kubenswrapper[4772]: I0930 18:18:30.975862 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:31 crc kubenswrapper[4772]: I0930 18:18:31.062647 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:31 crc kubenswrapper[4772]: I0930 18:18:31.235491 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:31 crc kubenswrapper[4772]: I0930 18:18:31.898515 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:18:31 crc kubenswrapper[4772]: E0930 18:18:31.899046 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:18:32 crc kubenswrapper[4772]: I0930 18:18:32.543976 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l45k5" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="registry-server" containerID="cri-o://cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1" gracePeriod=2 Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.037008 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.087856 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content\") pod \"53d8cae8-3ee9-4778-993c-07903200861f\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.088039 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtgr5\" (UniqueName: \"kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5\") pod \"53d8cae8-3ee9-4778-993c-07903200861f\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.088295 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities\") pod \"53d8cae8-3ee9-4778-993c-07903200861f\" (UID: \"53d8cae8-3ee9-4778-993c-07903200861f\") " Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.092729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities" (OuterVolumeSpecName: "utilities") pod "53d8cae8-3ee9-4778-993c-07903200861f" (UID: "53d8cae8-3ee9-4778-993c-07903200861f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.097405 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5" (OuterVolumeSpecName: "kube-api-access-gtgr5") pod "53d8cae8-3ee9-4778-993c-07903200861f" (UID: "53d8cae8-3ee9-4778-993c-07903200861f"). InnerVolumeSpecName "kube-api-access-gtgr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.147554 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53d8cae8-3ee9-4778-993c-07903200861f" (UID: "53d8cae8-3ee9-4778-993c-07903200861f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.191421 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.191473 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d8cae8-3ee9-4778-993c-07903200861f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.191490 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtgr5\" (UniqueName: \"kubernetes.io/projected/53d8cae8-3ee9-4778-993c-07903200861f-kube-api-access-gtgr5\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.556759 4772 generic.go:334] "Generic (PLEG): container finished" podID="53d8cae8-3ee9-4778-993c-07903200861f" containerID="cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1" exitCode=0 Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.556830 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerDied","Data":"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1"} Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.556869 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45k5" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.556900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45k5" event={"ID":"53d8cae8-3ee9-4778-993c-07903200861f","Type":"ContainerDied","Data":"132a9ffd48c51e7408d9ea8e2f3b8643fda650582bfc4325a256978f5df2f94b"} Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.556925 4772 scope.go:117] "RemoveContainer" containerID="cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.581999 4772 scope.go:117] "RemoveContainer" containerID="6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.602498 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.616109 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l45k5"] Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.648393 4772 scope.go:117] "RemoveContainer" containerID="bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.685076 4772 scope.go:117] "RemoveContainer" containerID="cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1" Sep 30 18:18:33 crc kubenswrapper[4772]: E0930 18:18:33.687638 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1\": container with ID starting with cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1 not found: ID does not exist" containerID="cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.687724 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1"} err="failed to get container status \"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1\": rpc error: code = NotFound desc = could not find container \"cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1\": container with ID starting with cc8a01e38528909b263336e6f17f47c640e84a8b8bee8abfd5bfb8b3f40bded1 not found: ID does not exist" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.687768 4772 scope.go:117] "RemoveContainer" containerID="6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900" Sep 30 18:18:33 crc kubenswrapper[4772]: E0930 18:18:33.688265 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900\": container with ID starting with 6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900 not found: ID does not exist" containerID="6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.688307 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900"} err="failed to get container status \"6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900\": rpc error: code = NotFound desc = could not find 
container \"6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900\": container with ID starting with 6b2d73a0695e26a64b1ef7f57a70eade0792e14ee4879117aa4aa636d3a27900 not found: ID does not exist" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.688335 4772 scope.go:117] "RemoveContainer" containerID="bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603" Sep 30 18:18:33 crc kubenswrapper[4772]: E0930 18:18:33.688710 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603\": container with ID starting with bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603 not found: ID does not exist" containerID="bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.688751 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603"} err="failed to get container status \"bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603\": rpc error: code = NotFound desc = could not find container \"bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603\": container with ID starting with bbc7a7d5ea9d131564032050075635d1f443c450c262a49fd353cc98c8eb4603 not found: ID does not exist" Sep 30 18:18:33 crc kubenswrapper[4772]: I0930 18:18:33.913561 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d8cae8-3ee9-4778-993c-07903200861f" path="/var/lib/kubelet/pods/53d8cae8-3ee9-4778-993c-07903200861f/volumes" Sep 30 18:18:42 crc kubenswrapper[4772]: I0930 18:18:42.899197 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:18:42 crc kubenswrapper[4772]: E0930 18:18:42.901320 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:18:53 crc kubenswrapper[4772]: I0930 18:18:53.898378 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:18:53 crc kubenswrapper[4772]: E0930 18:18:53.899189 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:19:07 crc kubenswrapper[4772]: I0930 18:19:07.898765 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:19:07 crc kubenswrapper[4772]: E0930 18:19:07.899734 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:19:18 crc kubenswrapper[4772]: I0930 18:19:18.898859 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:19:18 crc kubenswrapper[4772]: E0930 18:19:18.901022 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:19:29 crc kubenswrapper[4772]: I0930 18:19:29.906467 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:19:29 crc kubenswrapper[4772]: E0930 18:19:29.907317 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:19:44 crc kubenswrapper[4772]: I0930 18:19:44.899016 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:19:44 crc kubenswrapper[4772]: E0930 18:19:44.900388 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:19:58 crc kubenswrapper[4772]: I0930 18:19:58.899341 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:19:58 crc kubenswrapper[4772]: E0930 18:19:58.901235 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:20:11 crc kubenswrapper[4772]: I0930 18:20:11.898035 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:20:11 crc kubenswrapper[4772]: E0930 18:20:11.898839 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:20:24 crc kubenswrapper[4772]: I0930 18:20:24.899236 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:20:24 crc kubenswrapper[4772]: E0930 18:20:24.900465 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:20:39 crc kubenswrapper[4772]: I0930 18:20:39.910406 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:20:40 crc kubenswrapper[4772]: I0930 18:20:40.836473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f"} Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.366152 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:06 crc kubenswrapper[4772]: E0930 18:21:06.367224 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="extract-utilities" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.367241 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="extract-utilities" Sep 30 18:21:06 crc kubenswrapper[4772]: E0930 18:21:06.367259 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="extract-content" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.367266 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="extract-content" Sep 30 18:21:06 crc kubenswrapper[4772]: E0930 18:21:06.367296 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="registry-server" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.367305 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="registry-server" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.367517 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d8cae8-3ee9-4778-993c-07903200861f" containerName="registry-server" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.374731 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.385367 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.508952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbc2\" (UniqueName: \"kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.509576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.509890 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.612255 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbc2\" (UniqueName: \"kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.612384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.612471 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.612810 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.612900 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.649958 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7tbc2\" (UniqueName: \"kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2\") pod \"redhat-marketplace-bjn85\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:06 crc kubenswrapper[4772]: I0930 18:21:06.707298 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:07 crc kubenswrapper[4772]: I0930 18:21:07.250190 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:08 crc kubenswrapper[4772]: I0930 18:21:08.112560 4772 generic.go:334] "Generic (PLEG): container finished" podID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerID="db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2" exitCode=0 Sep 30 18:21:08 crc kubenswrapper[4772]: I0930 18:21:08.112694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerDied","Data":"db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2"} Sep 30 18:21:08 crc kubenswrapper[4772]: I0930 18:21:08.113123 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerStarted","Data":"b8240ce8fb9e932419d180b75249a8a620dc48f46a30be5b3f481aaa6bbc7ea8"} Sep 30 18:21:09 crc kubenswrapper[4772]: I0930 18:21:09.126483 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerStarted","Data":"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84"} Sep 30 18:21:10 crc kubenswrapper[4772]: I0930 18:21:10.138742 4772 generic.go:334] "Generic (PLEG): container finished" podID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerID="7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84" exitCode=0 Sep 30 18:21:10 crc kubenswrapper[4772]: I0930 18:21:10.138843 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerDied","Data":"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84"} Sep 30 18:21:11 crc kubenswrapper[4772]: I0930 18:21:11.151275 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerStarted","Data":"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090"} Sep 30 18:21:11 crc kubenswrapper[4772]: I0930 18:21:11.170742 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjn85" podStartSLOduration=2.686264937 podStartE2EDuration="5.170722316s" podCreationTimestamp="2025-09-30 18:21:06 +0000 UTC" firstStartedPulling="2025-09-30 18:21:08.115130272 +0000 UTC m=+4769.022143103" lastFinishedPulling="2025-09-30 18:21:10.599587651 +0000 UTC m=+4771.506600482" observedRunningTime="2025-09-30 18:21:11.166433783 +0000 UTC m=+4772.073446614" watchObservedRunningTime="2025-09-30 18:21:11.170722316 +0000 UTC m=+4772.077735147" Sep 30 18:21:16 crc kubenswrapper[4772]: I0930 18:21:16.707863 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:16 crc kubenswrapper[4772]: I0930 18:21:16.708425 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:16 crc kubenswrapper[4772]: I0930 18:21:16.788851 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:17 crc kubenswrapper[4772]: I0930 18:21:17.859274 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:17 crc kubenswrapper[4772]: I0930 18:21:17.915922 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.263248 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjn85" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="registry-server" containerID="cri-o://8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090" gracePeriod=2 Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.850266 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.966012 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tbc2\" (UniqueName: \"kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2\") pod \"49711f96-0b84-40b4-8db4-e706fb2a4279\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.966352 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content\") pod \"49711f96-0b84-40b4-8db4-e706fb2a4279\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.966429 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities\") pod \"49711f96-0b84-40b4-8db4-e706fb2a4279\" (UID: \"49711f96-0b84-40b4-8db4-e706fb2a4279\") " Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.971307 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities" (OuterVolumeSpecName: "utilities") pod "49711f96-0b84-40b4-8db4-e706fb2a4279" (UID: "49711f96-0b84-40b4-8db4-e706fb2a4279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:21:19 crc kubenswrapper[4772]: I0930 18:21:19.982430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49711f96-0b84-40b4-8db4-e706fb2a4279" (UID: "49711f96-0b84-40b4-8db4-e706fb2a4279"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.070046 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.070164 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49711f96-0b84-40b4-8db4-e706fb2a4279-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.278652 4772 generic.go:334] "Generic (PLEG): container finished" podID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerID="8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090" exitCode=0 Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.278730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerDied","Data":"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090"} Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.278752 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjn85" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.278790 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjn85" event={"ID":"49711f96-0b84-40b4-8db4-e706fb2a4279","Type":"ContainerDied","Data":"b8240ce8fb9e932419d180b75249a8a620dc48f46a30be5b3f481aaa6bbc7ea8"} Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.278825 4772 scope.go:117] "RemoveContainer" containerID="8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.299184 4772 scope.go:117] "RemoveContainer" containerID="7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.319853 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2" (OuterVolumeSpecName: "kube-api-access-7tbc2") pod "49711f96-0b84-40b4-8db4-e706fb2a4279" (UID: "49711f96-0b84-40b4-8db4-e706fb2a4279"). InnerVolumeSpecName "kube-api-access-7tbc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.331017 4772 scope.go:117] "RemoveContainer" containerID="db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.379181 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tbc2\" (UniqueName: \"kubernetes.io/projected/49711f96-0b84-40b4-8db4-e706fb2a4279-kube-api-access-7tbc2\") on node \"crc\" DevicePath \"\"" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.424200 4772 scope.go:117] "RemoveContainer" containerID="8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090" Sep 30 18:21:20 crc kubenswrapper[4772]: E0930 18:21:20.424669 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090\": container with ID starting with 8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090 not found: ID does not exist" containerID="8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.424772 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090"} err="failed to get container status \"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090\": rpc error: code = NotFound desc = could not find container \"8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090\": container with ID starting with 8e268890772ce58b54b7af796bb99d13c18786b44043d7ae7509cad82dfa1090 not found: ID does not exist" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.424869 4772 scope.go:117] "RemoveContainer" containerID="7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84" Sep 30 18:21:20 crc kubenswrapper[4772]: E0930 18:21:20.425406 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84\": container with ID starting with 7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84 not found: ID does not exist" containerID="7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.425470 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84"} err="failed to get container status \"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84\": rpc error: code = NotFound desc = could not find container \"7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84\": container with ID starting with 7d8ded9d78302cbf4965b007bde769198418141e91a1d74d170b62dea8831f84 not found: ID does not exist" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.425511 4772 scope.go:117] "RemoveContainer" containerID="db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2" Sep 30 18:21:20 crc kubenswrapper[4772]: E0930 18:21:20.426172 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2\": container with ID starting with db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2 not found: ID does not 
exist" containerID="db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.426203 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2"} err="failed to get container status \"db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2\": rpc error: code = NotFound desc = could not find container \"db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2\": container with ID starting with db22de703a8e3ea38a42ece1554b08a5773fbed082ce4559ea579ba447d6dbc2 not found: ID does not exist" Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.620669 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:20 crc kubenswrapper[4772]: I0930 18:21:20.629371 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjn85"] Sep 30 18:21:21 crc kubenswrapper[4772]: I0930 18:21:21.914371 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" path="/var/lib/kubelet/pods/49711f96-0b84-40b4-8db4-e706fb2a4279/volumes" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.627698 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:14 crc kubenswrapper[4772]: E0930 18:22:14.628861 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="extract-content" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.628878 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="extract-content" Sep 30 18:22:14 crc kubenswrapper[4772]: E0930 18:22:14.628900 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="registry-server" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.628908 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="registry-server" Sep 30 18:22:14 crc kubenswrapper[4772]: E0930 18:22:14.628938 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="extract-utilities" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.628946 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="extract-utilities" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.629206 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="49711f96-0b84-40b4-8db4-e706fb2a4279" containerName="registry-server" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.630980 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.646714 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.699900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.700432 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvlgn\" (UniqueName: \"kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.700520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.803767 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvlgn\" (UniqueName: \"kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.803900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.803953 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.804656 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.804806 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.838090 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pvlgn\" (UniqueName: \"kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn\") pod \"redhat-operators-rjsss\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:14 crc kubenswrapper[4772]: I0930 18:22:14.967908 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:15 crc kubenswrapper[4772]: I0930 18:22:15.445536 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:15 crc kubenswrapper[4772]: I0930 18:22:15.853570 4772 generic.go:334] "Generic (PLEG): container finished" podID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerID="7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be" exitCode=0 Sep 30 18:22:15 crc kubenswrapper[4772]: I0930 18:22:15.853645 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerDied","Data":"7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be"} Sep 30 18:22:15 crc kubenswrapper[4772]: I0930 18:22:15.854015 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerStarted","Data":"f921ee3978133279765db76f71ee89db4a992b90a5c120b265d8d160f4e69374"} Sep 30 18:22:17 crc kubenswrapper[4772]: I0930 18:22:17.876133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerStarted","Data":"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916"} Sep 30 18:22:20 crc kubenswrapper[4772]: I0930 18:22:20.905549 4772 generic.go:334] "Generic (PLEG): container finished" podID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerID="39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916" exitCode=0 Sep 30 18:22:20 crc kubenswrapper[4772]: I0930 18:22:20.905633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerDied","Data":"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916"} Sep 30 18:22:21 crc kubenswrapper[4772]: I0930 18:22:21.918441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerStarted","Data":"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24"} Sep 30 18:22:21 crc kubenswrapper[4772]: I0930 18:22:21.993471 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rjsss" podStartSLOduration=2.559440493 podStartE2EDuration="7.993441551s" podCreationTimestamp="2025-09-30 18:22:14 +0000 UTC" firstStartedPulling="2025-09-30 18:22:15.855954042 +0000 UTC m=+4836.762966873" lastFinishedPulling="2025-09-30 18:22:21.2899551 +0000 UTC m=+4842.196967931" observedRunningTime="2025-09-30 18:22:21.961853961 +0000 UTC m=+4842.868866792" watchObservedRunningTime="2025-09-30 18:22:21.993441551 +0000 UTC m=+4842.900454392" Sep 30 18:22:24 crc kubenswrapper[4772]: I0930 18:22:24.969264 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rjsss" Sep 
30 18:22:24 crc kubenswrapper[4772]: I0930 18:22:24.969872 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:26 crc kubenswrapper[4772]: I0930 18:22:26.015813 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rjsss" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="registry-server" probeResult="failure" output=< Sep 30 18:22:26 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 18:22:26 crc kubenswrapper[4772]: > Sep 30 18:22:35 crc kubenswrapper[4772]: I0930 18:22:35.022212 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:35 crc kubenswrapper[4772]: I0930 18:22:35.071487 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:35 crc kubenswrapper[4772]: I0930 18:22:35.257088 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.061443 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rjsss" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="registry-server" containerID="cri-o://fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24" gracePeriod=2 Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.597171 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.707519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content\") pod \"469ca2ab-45d5-4f03-b5c7-a2cead510389\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.707622 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvlgn\" (UniqueName: \"kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn\") pod \"469ca2ab-45d5-4f03-b5c7-a2cead510389\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.707767 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities\") pod \"469ca2ab-45d5-4f03-b5c7-a2cead510389\" (UID: \"469ca2ab-45d5-4f03-b5c7-a2cead510389\") " Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.710150 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities" (OuterVolumeSpecName: "utilities") pod "469ca2ab-45d5-4f03-b5c7-a2cead510389" (UID: "469ca2ab-45d5-4f03-b5c7-a2cead510389"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.718813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn" (OuterVolumeSpecName: "kube-api-access-pvlgn") pod "469ca2ab-45d5-4f03-b5c7-a2cead510389" (UID: "469ca2ab-45d5-4f03-b5c7-a2cead510389"). InnerVolumeSpecName "kube-api-access-pvlgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.790390 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "469ca2ab-45d5-4f03-b5c7-a2cead510389" (UID: "469ca2ab-45d5-4f03-b5c7-a2cead510389"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.811163 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.811500 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvlgn\" (UniqueName: \"kubernetes.io/projected/469ca2ab-45d5-4f03-b5c7-a2cead510389-kube-api-access-pvlgn\") on node \"crc\" DevicePath \"\"" Sep 30 18:22:36 crc kubenswrapper[4772]: I0930 18:22:36.811510 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469ca2ab-45d5-4f03-b5c7-a2cead510389-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.076983 4772 generic.go:334] "Generic (PLEG): container finished" podID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerID="fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24" exitCode=0 Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.077041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerDied","Data":"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24"} Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.077106 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjsss" event={"ID":"469ca2ab-45d5-4f03-b5c7-a2cead510389","Type":"ContainerDied","Data":"f921ee3978133279765db76f71ee89db4a992b90a5c120b265d8d160f4e69374"} Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.077106 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rjsss" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.077131 4772 scope.go:117] "RemoveContainer" containerID="fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.113761 4772 scope.go:117] "RemoveContainer" containerID="39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.162134 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.175496 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rjsss"] Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.245449 4772 scope.go:117] "RemoveContainer" containerID="7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.314516 4772 scope.go:117] "RemoveContainer" containerID="fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24" Sep 30 18:22:37 crc kubenswrapper[4772]: E0930 18:22:37.319256 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24\": container with ID starting with fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24 not found: ID does not exist" containerID="fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.319408 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24"} err="failed to get container status \"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24\": rpc error: code = NotFound desc = could not find container \"fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24\": container with ID starting with fb134398b7f20c5757537f3d4a9102bb4ff0cd43fd24886ff01340bc6e761e24 not found: ID does not exist" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.319455 4772 scope.go:117] "RemoveContainer" containerID="39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916" Sep 30 18:22:37 crc kubenswrapper[4772]: E0930 18:22:37.320001 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916\": container with ID starting with 39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916 not found: ID does not exist" containerID="39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.320050 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916"} err="failed to get container status \"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916\": rpc error: code = NotFound desc = could not find container \"39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916\": container with ID starting with 39a59d1e6f9966e46987973dbe4d7562e6f6f1466a60ace39bea2f31d2f83916 not found: ID does not exist" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.320086 4772 scope.go:117] "RemoveContainer" 
containerID="7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be" Sep 30 18:22:37 crc kubenswrapper[4772]: E0930 18:22:37.320798 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be\": container with ID starting with 7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be not found: ID does not exist" containerID="7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.320835 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be"} err="failed to get container status \"7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be\": rpc error: code = NotFound desc = could not find container \"7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be\": container with ID starting with 7d6cd4e90342702820f6cea9be6d65acaab5d60ab4462199f46415825d9ad1be not found: ID does not exist" Sep 30 18:22:37 crc kubenswrapper[4772]: I0930 18:22:37.916769 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" path="/var/lib/kubelet/pods/469ca2ab-45d5-4f03-b5c7-a2cead510389/volumes" Sep 30 18:23:08 crc kubenswrapper[4772]: I0930 18:23:08.655426 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:23:08 crc kubenswrapper[4772]: I0930 18:23:08.656082 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:23:38 crc kubenswrapper[4772]: I0930 18:23:38.655609 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:23:38 crc kubenswrapper[4772]: I0930 18:23:38.656255 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:24:08 crc kubenswrapper[4772]: I0930 18:24:08.655798 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:24:08 crc kubenswrapper[4772]: I0930 18:24:08.656609 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:24:08 crc kubenswrapper[4772]: I0930 18:24:08.656675 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:24:08 crc kubenswrapper[4772]: I0930 18:24:08.657808 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:24:08 crc kubenswrapper[4772]: I0930 18:24:08.657904 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f" gracePeriod=600 Sep 30 18:24:09 crc kubenswrapper[4772]: I0930 18:24:09.120989 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f" exitCode=0 Sep 30 18:24:09 crc kubenswrapper[4772]: I0930 18:24:09.121272 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f"} Sep 30 18:24:09 crc kubenswrapper[4772]: I0930 18:24:09.121623 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d"} Sep 30 18:24:09 crc kubenswrapper[4772]: I0930 18:24:09.121655 4772 scope.go:117] "RemoveContainer" containerID="a0d79aec6fd3c98b11e8f78b486d6fad7aaa0692e917bb14069b370bd9bed3a4" Sep 30 18:26:38 crc kubenswrapper[4772]: I0930 18:26:38.654851 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:26:38 crc kubenswrapper[4772]: I0930 18:26:38.655589 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:27:08 crc kubenswrapper[4772]: I0930 18:27:08.655165 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:27:08 crc kubenswrapper[4772]: I0930 18:27:08.655868 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:27:38 crc kubenswrapper[4772]: I0930 18:27:38.655199 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:27:38 crc kubenswrapper[4772]: I0930 18:27:38.656010 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:27:38 crc kubenswrapper[4772]: I0930 18:27:38.656084 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:27:38 crc kubenswrapper[4772]: I0930 18:27:38.656677 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:27:38 crc kubenswrapper[4772]: I0930 18:27:38.656748 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" gracePeriod=600 Sep 30 18:27:38 crc kubenswrapper[4772]: E0930 18:27:38.789910 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:27:39 crc kubenswrapper[4772]: I0930 18:27:39.522700 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" exitCode=0 Sep 30 18:27:39 crc kubenswrapper[4772]: I0930 18:27:39.522812 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d"} Sep 30 18:27:39 crc kubenswrapper[4772]: I0930 18:27:39.523210 4772 scope.go:117] "RemoveContainer" containerID="57601f64b0fcfb491e76d7f6eb786f6ee1666221cb1883d857964c25fac40c7f" Sep 30 18:27:39 crc kubenswrapper[4772]: I0930 18:27:39.524414 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:27:39 crc kubenswrapper[4772]: E0930 18:27:39.525039 4772 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:27:53 crc kubenswrapper[4772]: I0930 18:27:53.899025 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:27:53 crc kubenswrapper[4772]: E0930 18:27:53.900105 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:28:04 crc kubenswrapper[4772]: I0930 18:28:04.899773 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:28:04 crc kubenswrapper[4772]: E0930 18:28:04.901810 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:28:19 crc kubenswrapper[4772]: I0930 18:28:19.905291 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:28:19 crc kubenswrapper[4772]: E0930 18:28:19.906883 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.606325 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:22 crc kubenswrapper[4772]: E0930 18:28:22.607667 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="extract-content" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.607688 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="extract-content" Sep 30 18:28:22 crc kubenswrapper[4772]: E0930 18:28:22.607711 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="registry-server" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.607719 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="registry-server" Sep 30 18:28:22 crc kubenswrapper[4772]: E0930 18:28:22.607753 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" 
containerName="extract-utilities" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.607763 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="extract-utilities" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.608024 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="469ca2ab-45d5-4f03-b5c7-a2cead510389" containerName="registry-server" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.610073 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.627785 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.733347 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxrh\" (UniqueName: \"kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.733686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.733822 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.836886 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.837274 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxxrh\" (UniqueName: \"kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.837487 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.842332 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities\") pod \"community-operators-kpxmp\" (UID: 
\"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.842802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.862337 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxrh\" (UniqueName: \"kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh\") pod \"community-operators-kpxmp\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:22 crc kubenswrapper[4772]: I0930 18:28:22.934490 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:23 crc kubenswrapper[4772]: I0930 18:28:23.478885 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:24 crc kubenswrapper[4772]: I0930 18:28:24.058404 4772 generic.go:334] "Generic (PLEG): container finished" podID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerID="b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181" exitCode=0 Sep 30 18:28:24 crc kubenswrapper[4772]: I0930 18:28:24.058489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerDied","Data":"b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181"} Sep 30 18:28:24 crc kubenswrapper[4772]: I0930 18:28:24.058970 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerStarted","Data":"3cb18fdedf3505cc3ddce42a11af4abc18df209c7ac2500ed915af4f2437dcf9"} Sep 30 18:28:24 crc kubenswrapper[4772]: I0930 18:28:24.062653 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:28:26 crc kubenswrapper[4772]: I0930 18:28:26.091111 4772 generic.go:334] "Generic (PLEG): container finished" podID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerID="90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f" exitCode=0 Sep 30 18:28:26 crc kubenswrapper[4772]: I0930 18:28:26.091934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerDied","Data":"90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f"} Sep 30 18:28:27 crc kubenswrapper[4772]: I0930 18:28:27.107668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerStarted","Data":"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3"} Sep 30 18:28:27 crc kubenswrapper[4772]: I0930 18:28:27.134293 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kpxmp" podStartSLOduration=2.653470729 podStartE2EDuration="5.134275573s" podCreationTimestamp="2025-09-30 18:28:22 +0000 UTC" firstStartedPulling="2025-09-30 
18:28:24.062334745 +0000 UTC m=+5204.969347596" lastFinishedPulling="2025-09-30 18:28:26.543139609 +0000 UTC m=+5207.450152440" observedRunningTime="2025-09-30 18:28:27.130398222 +0000 UTC m=+5208.037411053" watchObservedRunningTime="2025-09-30 18:28:27.134275573 +0000 UTC m=+5208.041288404" Sep 30 18:28:31 crc kubenswrapper[4772]: I0930 18:28:31.898636 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:28:31 crc kubenswrapper[4772]: E0930 18:28:31.900205 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:28:32 crc kubenswrapper[4772]: I0930 18:28:32.935978 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:32 crc kubenswrapper[4772]: I0930 18:28:32.936350 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:32 crc kubenswrapper[4772]: I0930 18:28:32.987311 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:33 crc kubenswrapper[4772]: I0930 18:28:33.250315 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:33 crc kubenswrapper[4772]: I0930 18:28:33.313002 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.206046 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kpxmp" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="registry-server" containerID="cri-o://e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3" gracePeriod=2 Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.698820 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.791070 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content\") pod \"805d56c6-2564-4e9d-a87b-653d6a15554c\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.791572 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities\") pod \"805d56c6-2564-4e9d-a87b-653d6a15554c\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.791882 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxxrh\" (UniqueName: \"kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh\") pod \"805d56c6-2564-4e9d-a87b-653d6a15554c\" (UID: \"805d56c6-2564-4e9d-a87b-653d6a15554c\") " Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.796476 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities" (OuterVolumeSpecName: "utilities") pod "805d56c6-2564-4e9d-a87b-653d6a15554c" (UID: "805d56c6-2564-4e9d-a87b-653d6a15554c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.813670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh" (OuterVolumeSpecName: "kube-api-access-rxxrh") pod "805d56c6-2564-4e9d-a87b-653d6a15554c" (UID: "805d56c6-2564-4e9d-a87b-653d6a15554c"). InnerVolumeSpecName "kube-api-access-rxxrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.855569 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "805d56c6-2564-4e9d-a87b-653d6a15554c" (UID: "805d56c6-2564-4e9d-a87b-653d6a15554c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.894765 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxxrh\" (UniqueName: \"kubernetes.io/projected/805d56c6-2564-4e9d-a87b-653d6a15554c-kube-api-access-rxxrh\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.894828 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:35 crc kubenswrapper[4772]: I0930 18:28:35.894844 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805d56c6-2564-4e9d-a87b-653d6a15554c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.222795 4772 generic.go:334] "Generic (PLEG): container finished" podID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerID="e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3" exitCode=0 Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.222883 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerDied","Data":"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3"} Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.222944 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpxmp" event={"ID":"805d56c6-2564-4e9d-a87b-653d6a15554c","Type":"ContainerDied","Data":"3cb18fdedf3505cc3ddce42a11af4abc18df209c7ac2500ed915af4f2437dcf9"} Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.222985 4772 scope.go:117] "RemoveContainer" containerID="e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.222978 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kpxmp" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.261728 4772 scope.go:117] "RemoveContainer" containerID="90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.261938 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.275302 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kpxmp"] Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.282040 4772 scope.go:117] "RemoveContainer" containerID="b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.343876 4772 scope.go:117] "RemoveContainer" containerID="e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3" Sep 30 18:28:36 crc kubenswrapper[4772]: E0930 18:28:36.344740 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3\": container with ID starting with e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3 not found: ID does not exist" containerID="e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.344783 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3"} err="failed to get container status \"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3\": rpc error: code = NotFound desc = could not find container \"e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3\": container with ID starting with e6967e57fdc842276f42cc65b8feca3aff12d4c84ec3b3611a1e4f87da1d26d3 not found: ID does not exist" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.344811 4772 scope.go:117] "RemoveContainer" containerID="90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f" Sep 30 18:28:36 crc kubenswrapper[4772]: E0930 18:28:36.346080 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f\": container with ID starting with 90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f not found: ID does not exist" containerID="90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.346163 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f"} err="failed to get container status \"90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f\": rpc error: code = NotFound desc = could not find container \"90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f\": container with ID starting with 90c9098d41e48175baae837f1460cad5c591df0aeb5696a34b4c65d98c1a2d0f not found: ID does not exist" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.346213 4772 scope.go:117] "RemoveContainer" containerID="b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181" Sep 30 18:28:36 crc kubenswrapper[4772]: E0930 18:28:36.346675 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181\": container with ID starting with b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181 not found: ID does not exist" containerID="b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181" Sep 30 18:28:36 crc kubenswrapper[4772]: I0930 18:28:36.346707 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181"} err="failed to get container status \"b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181\": rpc error: code = NotFound desc = could not find container \"b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181\": container with ID starting with b9db719dc4e6196ab66a5f86314ca7ea9f7bcf89430951487b0c26e746870181 not found: ID does not exist" Sep 30 18:28:37 crc kubenswrapper[4772]: I0930 18:28:37.911493 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" path="/var/lib/kubelet/pods/805d56c6-2564-4e9d-a87b-653d6a15554c/volumes" Sep 30 18:28:45 crc kubenswrapper[4772]: I0930 18:28:45.899354 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:28:45 crc kubenswrapper[4772]: E0930 18:28:45.900759 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:28:59 crc kubenswrapper[4772]: I0930 18:28:59.929184 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:28:59 crc kubenswrapper[4772]: E0930 18:28:59.931114 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:29:13 crc kubenswrapper[4772]: I0930 18:29:13.898434 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:29:13 crc kubenswrapper[4772]: E0930 18:29:13.899589 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:29:25 crc kubenswrapper[4772]: I0930 18:29:25.899453 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:29:25 crc kubenswrapper[4772]: E0930 18:29:25.900881 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:29:39 crc kubenswrapper[4772]: I0930 18:29:39.899009 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:29:39 crc kubenswrapper[4772]: E0930 18:29:39.900128 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:29:52 crc kubenswrapper[4772]: I0930 18:29:52.898684 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:29:52 crc kubenswrapper[4772]: E0930 18:29:52.900011 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.172022 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66"] Sep 30 18:30:00 crc kubenswrapper[4772]: E0930 18:30:00.173142 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="extract-content" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.173162 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="extract-content" Sep 30 18:30:00 crc kubenswrapper[4772]: E0930 18:30:00.173191 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="extract-utilities" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.173202 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="extract-utilities" Sep 30 18:30:00 crc kubenswrapper[4772]: E0930 18:30:00.173257 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="registry-server" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.173265 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="registry-server" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.173666 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="805d56c6-2564-4e9d-a87b-653d6a15554c" containerName="registry-server" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.174741 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.177902 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.178237 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.187729 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66"] Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.290602 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmrf\" (UniqueName: \"kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.291105 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.291198 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.393671 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmrf\" (UniqueName: \"kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.393766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.393855 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.394872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume\") pod 
\"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.402257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.412037 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmrf\" (UniqueName: \"kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf\") pod \"collect-profiles-29320950-kmg66\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:00 crc kubenswrapper[4772]: I0930 18:30:00.512904 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:01 crc kubenswrapper[4772]: I0930 18:30:01.030940 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66"] Sep 30 18:30:01 crc kubenswrapper[4772]: W0930 18:30:01.045080 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7535f8_079f_475b_80e2_774473d91b62.slice/crio-f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af WatchSource:0}: Error finding container f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af: Status 404 returned error can't find the container with id f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af Sep 30 18:30:01 crc kubenswrapper[4772]: I0930 18:30:01.233700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" event={"ID":"9c7535f8-079f-475b-80e2-774473d91b62","Type":"ContainerStarted","Data":"f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af"} Sep 30 18:30:02 crc kubenswrapper[4772]: I0930 18:30:02.249820 4772 generic.go:334] "Generic (PLEG): container finished" podID="9c7535f8-079f-475b-80e2-774473d91b62" containerID="3c4d4eab5f9822b6ed63c91a69828a833421a554457c9e5a0e010939a028f21d" exitCode=0 Sep 30 18:30:02 crc kubenswrapper[4772]: I0930 18:30:02.249925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" event={"ID":"9c7535f8-079f-475b-80e2-774473d91b62","Type":"ContainerDied","Data":"3c4d4eab5f9822b6ed63c91a69828a833421a554457c9e5a0e010939a028f21d"} Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.602286 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.686924 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume\") pod \"9c7535f8-079f-475b-80e2-774473d91b62\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.687158 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmrf\" (UniqueName: \"kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf\") pod \"9c7535f8-079f-475b-80e2-774473d91b62\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.687225 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume\") pod \"9c7535f8-079f-475b-80e2-774473d91b62\" (UID: \"9c7535f8-079f-475b-80e2-774473d91b62\") " Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.688639 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c7535f8-079f-475b-80e2-774473d91b62" (UID: "9c7535f8-079f-475b-80e2-774473d91b62"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.693739 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf" (OuterVolumeSpecName: "kube-api-access-hkmrf") pod "9c7535f8-079f-475b-80e2-774473d91b62" (UID: "9c7535f8-079f-475b-80e2-774473d91b62"). InnerVolumeSpecName "kube-api-access-hkmrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.694266 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9c7535f8-079f-475b-80e2-774473d91b62" (UID: "9c7535f8-079f-475b-80e2-774473d91b62"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.790222 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c7535f8-079f-475b-80e2-774473d91b62-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.790269 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmrf\" (UniqueName: \"kubernetes.io/projected/9c7535f8-079f-475b-80e2-774473d91b62-kube-api-access-hkmrf\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:03 crc kubenswrapper[4772]: I0930 18:30:03.790278 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c7535f8-079f-475b-80e2-774473d91b62-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.274014 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" event={"ID":"9c7535f8-079f-475b-80e2-774473d91b62","Type":"ContainerDied","Data":"f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af"} Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.274079 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9a5f273a1b2c7f7fc3c2ba779c911cb38467c7407c0a2bfefda77da200e41af" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.274142 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-kmg66" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.338996 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:04 crc kubenswrapper[4772]: E0930 18:30:04.340481 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7535f8-079f-475b-80e2-774473d91b62" containerName="collect-profiles" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.340522 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7535f8-079f-475b-80e2-774473d91b62" containerName="collect-profiles" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.340800 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7535f8-079f-475b-80e2-774473d91b62" containerName="collect-profiles" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.342733 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.352277 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.406209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfgk\" (UniqueName: \"kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.406320 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.406406 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.509245 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.509366 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.509455 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfgk\" (UniqueName: \"kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.510447 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.510692 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.530044 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6jfgk\" (UniqueName: \"kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk\") pod \"certified-operators-kwlds\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.665194 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.704672 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8"] Sep 30 18:30:04 crc kubenswrapper[4772]: I0930 18:30:04.717161 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-tscv8"] Sep 30 18:30:05 crc kubenswrapper[4772]: I0930 18:30:05.277203 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:05 crc kubenswrapper[4772]: W0930 18:30:05.283509 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bf2eb75_7120_499a_b055_790886fec2ae.slice/crio-bc3f0f4af26d6608820eb14c0b9523675f789f71272294927fc741a82657637c WatchSource:0}: Error finding container bc3f0f4af26d6608820eb14c0b9523675f789f71272294927fc741a82657637c: Status 404 returned error can't find the container with id bc3f0f4af26d6608820eb14c0b9523675f789f71272294927fc741a82657637c Sep 30 18:30:05 crc kubenswrapper[4772]: I0930 18:30:05.917621 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801a662c-6452-49ea-962b-2ebfbc394f8f" path="/var/lib/kubelet/pods/801a662c-6452-49ea-962b-2ebfbc394f8f/volumes" Sep 30 18:30:06 crc kubenswrapper[4772]: I0930 18:30:06.305796 4772 generic.go:334] "Generic (PLEG): container finished" podID="4bf2eb75-7120-499a-b055-790886fec2ae" containerID="9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876" exitCode=0 Sep 30 18:30:06 crc kubenswrapper[4772]: I0930 18:30:06.305864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerDied","Data":"9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876"} Sep 30 18:30:06 crc kubenswrapper[4772]: I0930 18:30:06.305903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerStarted","Data":"bc3f0f4af26d6608820eb14c0b9523675f789f71272294927fc741a82657637c"} Sep 30 18:30:07 crc kubenswrapper[4772]: I0930 18:30:07.899121 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:30:07 crc kubenswrapper[4772]: E0930 18:30:07.900173 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:30:08 crc kubenswrapper[4772]: I0930 18:30:08.327569 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="4bf2eb75-7120-499a-b055-790886fec2ae" containerID="1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b" exitCode=0 Sep 30 18:30:08 crc kubenswrapper[4772]: I0930 18:30:08.327665 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerDied","Data":"1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b"} Sep 30 18:30:09 crc kubenswrapper[4772]: I0930 18:30:09.341183 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerStarted","Data":"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967"} Sep 30 18:30:09 crc kubenswrapper[4772]: I0930 18:30:09.366130 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kwlds" podStartSLOduration=2.92327142 podStartE2EDuration="5.366112829s" podCreationTimestamp="2025-09-30 18:30:04 +0000 UTC" firstStartedPulling="2025-09-30 18:30:06.309825568 +0000 UTC m=+5307.216838399" lastFinishedPulling="2025-09-30 18:30:08.752666977 +0000 UTC m=+5309.659679808" observedRunningTime="2025-09-30 18:30:09.364019225 +0000 UTC m=+5310.271032076" watchObservedRunningTime="2025-09-30 18:30:09.366112829 +0000 UTC m=+5310.273125660" Sep 30 18:30:14 crc kubenswrapper[4772]: I0930 18:30:14.665858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:14 crc kubenswrapper[4772]: I0930 18:30:14.666638 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:14 crc kubenswrapper[4772]: I0930 18:30:14.724798 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:15 crc kubenswrapper[4772]: I0930 18:30:15.472125 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:15 crc kubenswrapper[4772]: I0930 18:30:15.533257 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.421878 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kwlds" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="registry-server" containerID="cri-o://c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967" gracePeriod=2 Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.914777 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.987611 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jfgk\" (UniqueName: \"kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk\") pod \"4bf2eb75-7120-499a-b055-790886fec2ae\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.987795 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content\") pod \"4bf2eb75-7120-499a-b055-790886fec2ae\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.988078 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities\") pod \"4bf2eb75-7120-499a-b055-790886fec2ae\" (UID: \"4bf2eb75-7120-499a-b055-790886fec2ae\") " Sep 30 18:30:17 crc kubenswrapper[4772]: I0930 18:30:17.995404 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities" (OuterVolumeSpecName: "utilities") pod "4bf2eb75-7120-499a-b055-790886fec2ae" (UID: "4bf2eb75-7120-499a-b055-790886fec2ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.003103 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk" (OuterVolumeSpecName: "kube-api-access-6jfgk") pod "4bf2eb75-7120-499a-b055-790886fec2ae" (UID: "4bf2eb75-7120-499a-b055-790886fec2ae"). InnerVolumeSpecName "kube-api-access-6jfgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.091022 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.091089 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jfgk\" (UniqueName: \"kubernetes.io/projected/4bf2eb75-7120-499a-b055-790886fec2ae-kube-api-access-6jfgk\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.150640 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bf2eb75-7120-499a-b055-790886fec2ae" (UID: "4bf2eb75-7120-499a-b055-790886fec2ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.193455 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bf2eb75-7120-499a-b055-790886fec2ae-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.432232 4772 generic.go:334] "Generic (PLEG): container finished" podID="4bf2eb75-7120-499a-b055-790886fec2ae" containerID="c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967" exitCode=0 Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.432281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerDied","Data":"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967"} Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.432316 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwlds" event={"ID":"4bf2eb75-7120-499a-b055-790886fec2ae","Type":"ContainerDied","Data":"bc3f0f4af26d6608820eb14c0b9523675f789f71272294927fc741a82657637c"} Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.432334 4772 scope.go:117] "RemoveContainer" containerID="c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.432341 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwlds" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.450897 4772 scope.go:117] "RemoveContainer" containerID="1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.482478 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.482684 4772 scope.go:117] "RemoveContainer" containerID="9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.495587 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kwlds"] Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.528408 4772 scope.go:117] "RemoveContainer" containerID="c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967" Sep 30 18:30:18 crc kubenswrapper[4772]: E0930 18:30:18.529018 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967\": container with ID starting with c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967 not found: ID does not exist" containerID="c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.529051 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967"} err="failed to get container status \"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967\": rpc error: code = NotFound desc = could not find container \"c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967\": container with ID starting with c144b596759e9a73bb7c5c9895a04dc509cc3807145798c8066cdaf93eb10967 not found: ID does not exist" Sep 30 
18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.529083 4772 scope.go:117] "RemoveContainer" containerID="1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b" Sep 30 18:30:18 crc kubenswrapper[4772]: E0930 18:30:18.530311 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b\": container with ID starting with 1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b not found: ID does not exist" containerID="1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.530367 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b"} err="failed to get container status \"1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b\": rpc error: code = NotFound desc = could not find container \"1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b\": container with ID starting with 1f5bdc7d899a66a9455855934902027e0b926491e65fb03120064255ad70727b not found: ID does not exist" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.530405 4772 scope.go:117] "RemoveContainer" containerID="9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876" Sep 30 18:30:18 crc kubenswrapper[4772]: E0930 18:30:18.531129 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876\": container with ID starting with 9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876 not found: ID does not exist" containerID="9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876" Sep 30 18:30:18 crc kubenswrapper[4772]: I0930 18:30:18.531158 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876"} err="failed to get container status \"9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876\": rpc error: code = NotFound desc = could not find container \"9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876\": container with ID starting with 9b4bcffe2bb39f2b7a97c4aeb4ee3d6a903ed0fb858fdb75b9907cd3c7247876 not found: ID does not exist" Sep 30 18:30:19 crc kubenswrapper[4772]: I0930 18:30:19.919918 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" path="/var/lib/kubelet/pods/4bf2eb75-7120-499a-b055-790886fec2ae/volumes" Sep 30 18:30:22 crc kubenswrapper[4772]: I0930 18:30:22.900614 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:30:22 crc kubenswrapper[4772]: E0930 18:30:22.904565 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:30:35 crc kubenswrapper[4772]: I0930 18:30:35.898912 4772 scope.go:117] "RemoveContainer" 
containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:30:35 crc kubenswrapper[4772]: E0930 18:30:35.900211 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:30:47 crc kubenswrapper[4772]: I0930 18:30:47.033166 4772 scope.go:117] "RemoveContainer" containerID="edce61bdabf2cb239d68e624ddc29d5d72fa46c48888ab5629b305f935b27c04" Sep 30 18:30:48 crc kubenswrapper[4772]: I0930 18:30:48.899198 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:30:48 crc kubenswrapper[4772]: E0930 18:30:48.900735 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:31:00 crc kubenswrapper[4772]: I0930 18:31:00.899578 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:31:00 crc kubenswrapper[4772]: E0930 18:31:00.900623 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:31:13 crc kubenswrapper[4772]: I0930 18:31:13.898825 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:31:13 crc kubenswrapper[4772]: E0930 18:31:13.899634 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:31:26 crc kubenswrapper[4772]: I0930 18:31:26.898680 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:31:26 crc kubenswrapper[4772]: E0930 18:31:26.900514 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:31:37 crc kubenswrapper[4772]: I0930 18:31:37.898516 4772 scope.go:117] "RemoveContainer" 
containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:31:37 crc kubenswrapper[4772]: E0930 18:31:37.899510 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:31:51 crc kubenswrapper[4772]: I0930 18:31:51.898730 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:31:51 crc kubenswrapper[4772]: E0930 18:31:51.900413 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:32:04 crc kubenswrapper[4772]: I0930 18:32:04.899331 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:32:04 crc kubenswrapper[4772]: E0930 18:32:04.903831 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:32:19 crc kubenswrapper[4772]: I0930 18:32:19.906396 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:32:19 crc kubenswrapper[4772]: E0930 18:32:19.907670 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.469310 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:27 crc kubenswrapper[4772]: E0930 18:32:27.470654 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="extract-content" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.470672 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="extract-content" Sep 30 18:32:27 crc kubenswrapper[4772]: E0930 18:32:27.470694 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="extract-utilities" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.470703 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="extract-utilities" 
Sep 30 18:32:27 crc kubenswrapper[4772]: E0930 18:32:27.470743 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="registry-server" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.470753 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="registry-server" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.471011 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bf2eb75-7120-499a-b055-790886fec2ae" containerName="registry-server" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.472964 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.479592 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.639875 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.640368 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qccs\" (UniqueName: \"kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.641186 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.743791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.744050 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.744270 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qccs\" (UniqueName: \"kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.744532 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.744545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.773124 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qccs\" (UniqueName: \"kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs\") pod \"redhat-marketplace-t74mq\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:27 crc kubenswrapper[4772]: I0930 18:32:27.807362 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:28 crc kubenswrapper[4772]: I0930 18:32:28.331731 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:28 crc kubenswrapper[4772]: I0930 18:32:28.957045 4772 generic.go:334] "Generic (PLEG): container finished" podID="2fc42f49-e302-444b-864b-5f410b1d1433" containerID="89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12" exitCode=0 Sep 30 18:32:28 crc kubenswrapper[4772]: I0930 18:32:28.957305 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerDied","Data":"89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12"} Sep 30 18:32:28 crc kubenswrapper[4772]: I0930 18:32:28.960500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerStarted","Data":"b2a458b1f773e72b65cf1a27a808369dea424f22497c5a702e106f9a88bd908f"} Sep 30 18:32:30 crc kubenswrapper[4772]: I0930 18:32:30.995730 4772 generic.go:334] "Generic (PLEG): container finished" podID="2fc42f49-e302-444b-864b-5f410b1d1433" containerID="ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d" exitCode=0 Sep 30 18:32:30 crc kubenswrapper[4772]: I0930 18:32:30.995804 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerDied","Data":"ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d"} Sep 30 18:32:32 crc kubenswrapper[4772]: I0930 18:32:32.019850 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerStarted","Data":"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6"} Sep 30 18:32:32 crc kubenswrapper[4772]: I0930 18:32:32.052184 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t74mq" podStartSLOduration=2.568916979 podStartE2EDuration="5.052154068s" podCreationTimestamp="2025-09-30 18:32:27 +0000 UTC" firstStartedPulling="2025-09-30 18:32:28.958837293 +0000 UTC m=+5449.865850114" 
lastFinishedPulling="2025-09-30 18:32:31.442074372 +0000 UTC m=+5452.349087203" observedRunningTime="2025-09-30 18:32:32.049769086 +0000 UTC m=+5452.956781937" watchObservedRunningTime="2025-09-30 18:32:32.052154068 +0000 UTC m=+5452.959166899" Sep 30 18:32:34 crc kubenswrapper[4772]: I0930 18:32:34.899109 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:32:34 crc kubenswrapper[4772]: E0930 18:32:34.899632 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:32:37 crc kubenswrapper[4772]: I0930 18:32:37.808817 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:37 crc kubenswrapper[4772]: I0930 18:32:37.809833 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:38 crc kubenswrapper[4772]: I0930 18:32:38.266259 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:39 crc kubenswrapper[4772]: I0930 18:32:39.166962 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:39 crc kubenswrapper[4772]: I0930 18:32:39.225007 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.130038 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t74mq" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="registry-server" containerID="cri-o://ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6" gracePeriod=2 Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.646476 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.810210 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content\") pod \"2fc42f49-e302-444b-864b-5f410b1d1433\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.810335 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities\") pod \"2fc42f49-e302-444b-864b-5f410b1d1433\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.810673 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qccs\" (UniqueName: \"kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs\") pod \"2fc42f49-e302-444b-864b-5f410b1d1433\" (UID: \"2fc42f49-e302-444b-864b-5f410b1d1433\") " Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.811724 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities" (OuterVolumeSpecName: "utilities") pod "2fc42f49-e302-444b-864b-5f410b1d1433" (UID: "2fc42f49-e302-444b-864b-5f410b1d1433"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.824185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs" (OuterVolumeSpecName: "kube-api-access-2qccs") pod "2fc42f49-e302-444b-864b-5f410b1d1433" (UID: "2fc42f49-e302-444b-864b-5f410b1d1433"). InnerVolumeSpecName "kube-api-access-2qccs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.826117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fc42f49-e302-444b-864b-5f410b1d1433" (UID: "2fc42f49-e302-444b-864b-5f410b1d1433"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.913896 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.913936 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc42f49-e302-444b-864b-5f410b1d1433-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:41 crc kubenswrapper[4772]: I0930 18:32:41.913952 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qccs\" (UniqueName: \"kubernetes.io/projected/2fc42f49-e302-444b-864b-5f410b1d1433-kube-api-access-2qccs\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.146486 4772 generic.go:334] "Generic (PLEG): container finished" podID="2fc42f49-e302-444b-864b-5f410b1d1433" containerID="ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6" exitCode=0 Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.146558 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t74mq" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.146552 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerDied","Data":"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6"} Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.146652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t74mq" event={"ID":"2fc42f49-e302-444b-864b-5f410b1d1433","Type":"ContainerDied","Data":"b2a458b1f773e72b65cf1a27a808369dea424f22497c5a702e106f9a88bd908f"} Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.146672 4772 scope.go:117] "RemoveContainer" containerID="ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.183581 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.192467 4772 scope.go:117] "RemoveContainer" containerID="ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.196744 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t74mq"] Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.238182 4772 scope.go:117] "RemoveContainer" containerID="89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.283839 4772 scope.go:117] "RemoveContainer" containerID="ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6" Sep 30 18:32:42 crc kubenswrapper[4772]: E0930 18:32:42.284536 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6\": container with ID starting with ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6 not found: ID does not exist" containerID="ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.285652 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6"} err="failed to get container status \"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6\": rpc error: code = NotFound desc = could not find container \"ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6\": container with ID starting with ae84a7766c54e8159bc08e73800cbb2f3edaef69a8be734a8f4c3c2181f8caf6 not found: ID does not exist" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.285675 4772 scope.go:117] "RemoveContainer" containerID="ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d" Sep 30 18:32:42 crc kubenswrapper[4772]: E0930 18:32:42.286267 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d\": container with ID starting with ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d not found: ID does not exist" containerID="ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.286325 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d"} err="failed to get container status \"ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d\": rpc error: code = NotFound desc = could not find container \"ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d\": container with ID starting with ee132233fd25eb48ee6615bce4b4ffb7f0f94b9ab020d83dacfbbcea8be44a8d not found: ID does not exist" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.286367 4772 scope.go:117] "RemoveContainer" containerID="89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12" Sep 30 18:32:42 crc kubenswrapper[4772]: E0930 18:32:42.287249 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12\": container with ID starting with 89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12 not found: ID does not exist" containerID="89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12" Sep 30 18:32:42 crc kubenswrapper[4772]: I0930 18:32:42.287294 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12"} err="failed to get container status \"89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12\": rpc error: code = NotFound desc = could not find container \"89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12\": container with ID starting with 89e55e9de114c59cbb2addc1517c0c705d58d7467835e0e818ec94aff8c8fb12 not found: ID does not exist" Sep 30 18:32:43 crc kubenswrapper[4772]: I0930 18:32:43.918880 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" path="/var/lib/kubelet/pods/2fc42f49-e302-444b-864b-5f410b1d1433/volumes" Sep 30 18:32:47 crc kubenswrapper[4772]: I0930 18:32:47.899183 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:32:48 crc kubenswrapper[4772]: I0930 18:32:48.219720 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d"} Sep 30 18:33:10 crc kubenswrapper[4772]: E0930 18:33:10.041684 4772 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.115:60658->38.102.83.115:35633: write tcp 38.102.83.115:60658->38.102.83.115:35633: write: broken pipe Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.746991 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:10 crc kubenswrapper[4772]: E0930 18:33:10.747416 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="extract-content" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.747429 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="extract-content" Sep 30 18:33:10 crc kubenswrapper[4772]: E0930 18:33:10.747450 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="extract-utilities" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.747457 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="extract-utilities" Sep 30 18:33:10 crc kubenswrapper[4772]: E0930 18:33:10.747483 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="registry-server" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.747489 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="registry-server" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.747686 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc42f49-e302-444b-864b-5f410b1d1433" containerName="registry-server" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.749095 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.749171 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.830896 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dn2s\" (UniqueName: \"kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.831004 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.831195 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.933857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dn2s\" (UniqueName: \"kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.933954 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.934045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.934589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.934694 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content\") pod \"redhat-operators-qfcnn\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:10 crc kubenswrapper[4772]: I0930 18:33:10.962183 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dn2s\" (UniqueName: \"kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s\") pod \"redhat-operators-qfcnn\" (UID: 
\"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:11 crc kubenswrapper[4772]: I0930 18:33:11.123382 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:11 crc kubenswrapper[4772]: I0930 18:33:11.700552 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:12 crc kubenswrapper[4772]: I0930 18:33:12.510697 4772 generic.go:334] "Generic (PLEG): container finished" podID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerID="f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b" exitCode=0 Sep 30 18:33:12 crc kubenswrapper[4772]: I0930 18:33:12.510762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerDied","Data":"f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b"} Sep 30 18:33:12 crc kubenswrapper[4772]: I0930 18:33:12.511609 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerStarted","Data":"d7999b8cf91ed59ab7deb8ca96ba3162b5ad3a183aeeded3683bd55824659ba2"} Sep 30 18:33:14 crc kubenswrapper[4772]: I0930 18:33:14.537685 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerStarted","Data":"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c"} Sep 30 18:33:16 crc kubenswrapper[4772]: I0930 18:33:16.568551 4772 generic.go:334] "Generic (PLEG): container finished" podID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerID="cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c" exitCode=0 Sep 30 18:33:16 crc kubenswrapper[4772]: I0930 18:33:16.568664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerDied","Data":"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c"} Sep 30 18:33:17 crc kubenswrapper[4772]: I0930 18:33:17.582165 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerStarted","Data":"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195"} Sep 30 18:33:17 crc kubenswrapper[4772]: I0930 18:33:17.616914 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qfcnn" podStartSLOduration=3.130952681 podStartE2EDuration="7.616886008s" podCreationTimestamp="2025-09-30 18:33:10 +0000 UTC" firstStartedPulling="2025-09-30 18:33:12.513873123 +0000 UTC m=+5493.420885974" lastFinishedPulling="2025-09-30 18:33:16.99980646 +0000 UTC m=+5497.906819301" observedRunningTime="2025-09-30 18:33:17.604621329 +0000 UTC m=+5498.511634170" watchObservedRunningTime="2025-09-30 18:33:17.616886008 +0000 UTC m=+5498.523898839" Sep 30 18:33:21 crc kubenswrapper[4772]: I0930 18:33:21.124343 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:21 crc kubenswrapper[4772]: I0930 18:33:21.126193 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:22 crc kubenswrapper[4772]: I0930 18:33:22.186995 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qfcnn" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="registry-server" probeResult="failure" output=< Sep 30 18:33:22 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 18:33:22 crc kubenswrapper[4772]: > Sep 30 18:33:31 crc kubenswrapper[4772]: I0930 18:33:31.199459 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:31 crc kubenswrapper[4772]: I0930 18:33:31.274547 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:31 crc kubenswrapper[4772]: I0930 18:33:31.453221 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:32 crc kubenswrapper[4772]: I0930 18:33:32.749022 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qfcnn" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="registry-server" containerID="cri-o://04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195" gracePeriod=2 Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.350878 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.530144 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content\") pod \"5c00d350-8176-4d70-9d8c-14cb42d8e543\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.531122 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities\") pod \"5c00d350-8176-4d70-9d8c-14cb42d8e543\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.531250 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dn2s\" (UniqueName: \"kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s\") pod \"5c00d350-8176-4d70-9d8c-14cb42d8e543\" (UID: \"5c00d350-8176-4d70-9d8c-14cb42d8e543\") " Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.531835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities" (OuterVolumeSpecName: "utilities") pod "5c00d350-8176-4d70-9d8c-14cb42d8e543" (UID: "5c00d350-8176-4d70-9d8c-14cb42d8e543"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.532527 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.541931 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s" (OuterVolumeSpecName: "kube-api-access-2dn2s") pod "5c00d350-8176-4d70-9d8c-14cb42d8e543" (UID: "5c00d350-8176-4d70-9d8c-14cb42d8e543"). InnerVolumeSpecName "kube-api-access-2dn2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.622941 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c00d350-8176-4d70-9d8c-14cb42d8e543" (UID: "5c00d350-8176-4d70-9d8c-14cb42d8e543"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.635129 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c00d350-8176-4d70-9d8c-14cb42d8e543-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.635408 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dn2s\" (UniqueName: \"kubernetes.io/projected/5c00d350-8176-4d70-9d8c-14cb42d8e543-kube-api-access-2dn2s\") on node \"crc\" DevicePath \"\"" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.763349 4772 generic.go:334] "Generic (PLEG): container finished" podID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerID="04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195" exitCode=0 Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.763390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerDied","Data":"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195"} Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.763458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfcnn" event={"ID":"5c00d350-8176-4d70-9d8c-14cb42d8e543","Type":"ContainerDied","Data":"d7999b8cf91ed59ab7deb8ca96ba3162b5ad3a183aeeded3683bd55824659ba2"} Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.763472 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfcnn" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.763484 4772 scope.go:117] "RemoveContainer" containerID="04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.784143 4772 scope.go:117] "RemoveContainer" containerID="cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c" Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.805317 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.813712 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qfcnn"] Sep 30 18:33:33 crc kubenswrapper[4772]: I0930 18:33:33.915712 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" path="/var/lib/kubelet/pods/5c00d350-8176-4d70-9d8c-14cb42d8e543/volumes" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.252020 4772 scope.go:117] "RemoveContainer" containerID="f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.325785 4772 scope.go:117] "RemoveContainer" containerID="04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195" Sep 30 18:33:34 crc kubenswrapper[4772]: E0930 18:33:34.326770 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195\": container with ID starting with 04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195 not found: ID does not exist" containerID="04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.326820 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195"} err="failed to get container status \"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195\": rpc error: code = NotFound desc = could not find container \"04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195\": container with ID starting with 04b92abfb6a8333cd30767443d82617e072b3b675d17bf889c87dcb4a7494195 not found: ID does not exist" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.326858 4772 scope.go:117] "RemoveContainer" containerID="cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c" Sep 30 18:33:34 crc kubenswrapper[4772]: E0930 18:33:34.327653 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c\": container with ID starting with cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c not found: ID does not exist" containerID="cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.327736 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c"} err="failed to get container status \"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c\": rpc error: code = NotFound desc = could not find container \"cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c\": 
container with ID starting with cc39ea46026087aaad65e89c625616bd2d4b6a02d58c9d43ce977eb8e228297c not found: ID does not exist" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.327795 4772 scope.go:117] "RemoveContainer" containerID="f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b" Sep 30 18:33:34 crc kubenswrapper[4772]: E0930 18:33:34.328317 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b\": container with ID starting with f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b not found: ID does not exist" containerID="f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b" Sep 30 18:33:34 crc kubenswrapper[4772]: I0930 18:33:34.328351 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b"} err="failed to get container status \"f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b\": rpc error: code = NotFound desc = could not find container \"f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b\": container with ID starting with f99a274dd038e8d2777ba275de9dfb69d0714b42d2bfcb0080edfa5fe514fb7b not found: ID does not exist" Sep 30 18:35:08 crc kubenswrapper[4772]: I0930 18:35:08.656270 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:35:08 crc kubenswrapper[4772]: I0930 18:35:08.657505 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:35:38 crc kubenswrapper[4772]: I0930 18:35:38.655026 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:35:38 crc kubenswrapper[4772]: I0930 18:35:38.655659 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:36:08 crc kubenswrapper[4772]: I0930 18:36:08.655548 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:36:08 crc kubenswrapper[4772]: I0930 18:36:08.656530 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:36:08 crc kubenswrapper[4772]: I0930 18:36:08.656600 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:36:08 crc kubenswrapper[4772]: I0930 18:36:08.657822 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:36:08 crc kubenswrapper[4772]: I0930 18:36:08.657892 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d" gracePeriod=600 Sep 30 18:36:09 crc kubenswrapper[4772]: I0930 18:36:09.663460 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d" exitCode=0 Sep 30 18:36:09 crc kubenswrapper[4772]: I0930 18:36:09.663531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d"} Sep 30 18:36:09 crc kubenswrapper[4772]: I0930 18:36:09.664335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f"} Sep 30 18:36:09 crc kubenswrapper[4772]: I0930 18:36:09.664365 4772 scope.go:117] "RemoveContainer" containerID="41dcb32d04e4c339ac606cb0ec7a1d7d101b296403ba8e03318883611976675d" Sep 30 18:38:38 crc kubenswrapper[4772]: I0930 18:38:38.655471 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:38:38 crc kubenswrapper[4772]: I0930 18:38:38.656555 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:39:08 crc kubenswrapper[4772]: I0930 18:39:08.655413 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:39:08 crc kubenswrapper[4772]: I0930 18:39:08.656213 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:39:38 crc kubenswrapper[4772]: I0930 18:39:38.656020 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:39:38 crc kubenswrapper[4772]: I0930 18:39:38.656826 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:39:38 crc kubenswrapper[4772]: I0930 18:39:38.656899 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:39:38 crc kubenswrapper[4772]: I0930 18:39:38.658458 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:39:38 crc kubenswrapper[4772]: I0930 18:39:38.658571 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" gracePeriod=600 Sep 30 18:39:38 crc kubenswrapper[4772]: E0930 18:39:38.782155 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:39:39 crc kubenswrapper[4772]: I0930 18:39:39.008669 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" exitCode=0 Sep 30 18:39:39 crc kubenswrapper[4772]: I0930 18:39:39.008730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f"} Sep 30 18:39:39 crc kubenswrapper[4772]: I0930 18:39:39.008775 4772 scope.go:117] "RemoveContainer" containerID="e508c193ddc1c82e1797483df26b5df030cb50aa471b03caa62b6359e66ca33d" Sep 30 18:39:39 crc kubenswrapper[4772]: I0930 18:39:39.009885 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:39:39 crc kubenswrapper[4772]: E0930 18:39:39.010260 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:39:53 crc kubenswrapper[4772]: I0930 18:39:53.897987 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:39:53 crc kubenswrapper[4772]: E0930 18:39:53.898692 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:40:06 crc kubenswrapper[4772]: I0930 18:40:06.899382 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:40:06 crc kubenswrapper[4772]: E0930 18:40:06.900852 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:40:17 crc kubenswrapper[4772]: I0930 18:40:17.898979 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:40:17 crc kubenswrapper[4772]: E0930 18:40:17.899913 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:40:32 crc kubenswrapper[4772]: I0930 18:40:32.898476 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:40:32 crc kubenswrapper[4772]: E0930 18:40:32.899301 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:40:44 crc kubenswrapper[4772]: I0930 18:40:44.899036 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:40:44 crc kubenswrapper[4772]: E0930 18:40:44.900588 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:40:56 crc kubenswrapper[4772]: I0930 18:40:56.898938 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:40:56 crc kubenswrapper[4772]: E0930 18:40:56.900325 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.666660 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:00 crc kubenswrapper[4772]: E0930 18:41:00.667832 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="registry-server" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.667848 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="registry-server" Sep 30 18:41:00 crc kubenswrapper[4772]: E0930 18:41:00.667879 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="extract-content" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.667884 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="extract-content" Sep 30 18:41:00 crc kubenswrapper[4772]: E0930 18:41:00.667909 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="extract-utilities" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.667916 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="extract-utilities" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.668142 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c00d350-8176-4d70-9d8c-14cb42d8e543" containerName="registry-server" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.669719 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.711430 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.739481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.739575 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrfd\" (UniqueName: \"kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.739598 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.841875 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.842104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrfd\" (UniqueName: \"kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.842132 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.842494 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.842700 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:00 crc kubenswrapper[4772]: I0930 18:41:00.863912 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wbrfd\" (UniqueName: \"kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd\") pod \"certified-operators-7rlkj\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.004511 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.614535 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.903401 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerID="ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e" exitCode=0 Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.909348 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.912469 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerDied","Data":"ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e"} Sep 30 18:41:01 crc kubenswrapper[4772]: I0930 18:41:01.912554 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerStarted","Data":"579ffe4c7c47187aaf4dbd17a0e2dd7bc440b4c206af00f09a11e116a2297c95"} Sep 30 18:41:03 crc kubenswrapper[4772]: I0930 18:41:03.926591 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerID="23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554" exitCode=0 Sep 30 18:41:03 crc kubenswrapper[4772]: I0930 18:41:03.926675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerDied","Data":"23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554"} Sep 30 18:41:04 crc kubenswrapper[4772]: I0930 18:41:04.941812 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerStarted","Data":"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8"} Sep 30 18:41:04 crc kubenswrapper[4772]: I0930 18:41:04.969902 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7rlkj" podStartSLOduration=2.462958076 podStartE2EDuration="4.969875497s" podCreationTimestamp="2025-09-30 18:41:00 +0000 UTC" firstStartedPulling="2025-09-30 18:41:01.909111791 +0000 UTC m=+5962.816124622" lastFinishedPulling="2025-09-30 18:41:04.416029212 +0000 UTC m=+5965.323042043" observedRunningTime="2025-09-30 18:41:04.962132147 +0000 UTC m=+5965.869144998" watchObservedRunningTime="2025-09-30 18:41:04.969875497 +0000 UTC m=+5965.876888328" Sep 30 18:41:11 crc kubenswrapper[4772]: I0930 18:41:11.005024 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:11 crc kubenswrapper[4772]: I0930 18:41:11.006429 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:11 crc kubenswrapper[4772]: I0930 18:41:11.052459 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:11 crc kubenswrapper[4772]: I0930 18:41:11.899307 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:41:11 crc kubenswrapper[4772]: E0930 18:41:11.899821 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:41:12 crc kubenswrapper[4772]: I0930 18:41:12.094333 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:12 crc kubenswrapper[4772]: I0930 18:41:12.160184 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.054850 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7rlkj" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="registry-server" containerID="cri-o://022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8" gracePeriod=2 Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.648137 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.794075 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbrfd\" (UniqueName: \"kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd\") pod \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.794397 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities\") pod \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.794431 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content\") pod \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\" (UID: \"5b597a59-efcb-45e1-babe-9c46e2fb12c2\") " Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.795326 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities" (OuterVolumeSpecName: "utilities") pod "5b597a59-efcb-45e1-babe-9c46e2fb12c2" (UID: "5b597a59-efcb-45e1-babe-9c46e2fb12c2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.804636 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd" (OuterVolumeSpecName: "kube-api-access-wbrfd") pod "5b597a59-efcb-45e1-babe-9c46e2fb12c2" (UID: "5b597a59-efcb-45e1-babe-9c46e2fb12c2"). InnerVolumeSpecName "kube-api-access-wbrfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.841507 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b597a59-efcb-45e1-babe-9c46e2fb12c2" (UID: "5b597a59-efcb-45e1-babe-9c46e2fb12c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.899816 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbrfd\" (UniqueName: \"kubernetes.io/projected/5b597a59-efcb-45e1-babe-9c46e2fb12c2-kube-api-access-wbrfd\") on node \"crc\" DevicePath \"\"" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.899871 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:41:14 crc kubenswrapper[4772]: I0930 18:41:14.899884 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b597a59-efcb-45e1-babe-9c46e2fb12c2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.068022 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerID="022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8" exitCode=0 Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.068159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerDied","Data":"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8"} Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.068236 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7rlkj" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.068258 4772 scope.go:117] "RemoveContainer" containerID="022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.068240 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rlkj" event={"ID":"5b597a59-efcb-45e1-babe-9c46e2fb12c2","Type":"ContainerDied","Data":"579ffe4c7c47187aaf4dbd17a0e2dd7bc440b4c206af00f09a11e116a2297c95"} Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.092278 4772 scope.go:117] "RemoveContainer" containerID="23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.122009 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.131588 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7rlkj"] Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.133880 4772 scope.go:117] "RemoveContainer" containerID="ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.194473 4772 scope.go:117] "RemoveContainer" containerID="022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8" Sep 30 18:41:15 crc kubenswrapper[4772]: E0930 18:41:15.195526 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8\": container with ID starting with 022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8 not found: ID does not exist" containerID="022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.195563 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8"} err="failed to get container status \"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8\": rpc error: code = NotFound desc = could not find container \"022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8\": container with ID starting with 022f71fc1167c180dd5e0cb6dfd954107a40e8e551c2f18ead6403606169eac8 not found: ID does not exist" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.195590 4772 scope.go:117] "RemoveContainer" containerID="23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554" Sep 30 18:41:15 crc kubenswrapper[4772]: E0930 18:41:15.195999 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554\": container with ID starting with 23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554 not found: ID does not exist" containerID="23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.196042 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554"} err="failed to get container status \"23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554\": rpc error: code = NotFound desc = could not find 
container \"23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554\": container with ID starting with 23aa6884d1f6dbdb668acb300a93b3cb93657a95865d9d0da7caa2418a1b4554 not found: ID does not exist" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.196081 4772 scope.go:117] "RemoveContainer" containerID="ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e" Sep 30 18:41:15 crc kubenswrapper[4772]: E0930 18:41:15.196482 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e\": container with ID starting with ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e not found: ID does not exist" containerID="ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.196511 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e"} err="failed to get container status \"ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e\": rpc error: code = NotFound desc = could not find container \"ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e\": container with ID starting with ae1d81a1f384cd86d787109a2f5afb3b72d71f871a33f1fcbb0d9eeddb313f3e not found: ID does not exist" Sep 30 18:41:15 crc kubenswrapper[4772]: I0930 18:41:15.913715 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" path="/var/lib/kubelet/pods/5b597a59-efcb-45e1-babe-9c46e2fb12c2/volumes" Sep 30 18:41:25 crc kubenswrapper[4772]: I0930 18:41:25.898983 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:41:25 crc kubenswrapper[4772]: E0930 18:41:25.900720 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:41:40 crc kubenswrapper[4772]: I0930 18:41:40.899581 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:41:40 crc kubenswrapper[4772]: E0930 18:41:40.900805 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:41:55 crc kubenswrapper[4772]: I0930 18:41:55.902069 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:41:55 crc kubenswrapper[4772]: E0930 18:41:55.903109 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:42:06 crc kubenswrapper[4772]: I0930 18:42:06.899933 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:42:06 crc kubenswrapper[4772]: E0930 18:42:06.900946 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:42:17 crc kubenswrapper[4772]: I0930 18:42:17.904371 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:42:17 crc kubenswrapper[4772]: E0930 18:42:17.905749 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:42:29 crc kubenswrapper[4772]: I0930 18:42:29.908911 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:42:29 crc kubenswrapper[4772]: E0930 18:42:29.910453 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:42:43 crc kubenswrapper[4772]: I0930 18:42:43.899036 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:42:43 crc kubenswrapper[4772]: E0930 18:42:43.900673 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:42:56 crc kubenswrapper[4772]: I0930 18:42:56.898859 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:42:56 crc kubenswrapper[4772]: E0930 18:42:56.900147 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.882772 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:05 crc kubenswrapper[4772]: E0930 18:43:05.884491 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="registry-server" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.884514 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="registry-server" Sep 30 18:43:05 crc kubenswrapper[4772]: E0930 18:43:05.884538 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="extract-utilities" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.884549 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="extract-utilities" Sep 30 18:43:05 crc kubenswrapper[4772]: E0930 18:43:05.884567 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="extract-content" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.884579 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="extract-content" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.884924 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b597a59-efcb-45e1-babe-9c46e2fb12c2" containerName="registry-server" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.887198 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:05 crc kubenswrapper[4772]: I0930 18:43:05.913395 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.008253 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.008375 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.008463 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfq5n\" (UniqueName: \"kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.112199 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfq5n\" (UniqueName: \"kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n\") pod \"redhat-marketplace-d7p82\" (UID: 
\"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.112768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.112878 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.113577 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.113609 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.147775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfq5n\" (UniqueName: \"kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n\") pod \"redhat-marketplace-d7p82\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.222413 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:06 crc kubenswrapper[4772]: I0930 18:43:06.717729 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:06 crc kubenswrapper[4772]: W0930 18:43:06.738489 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdfff439_2c39_4a27_920a_150d447ae757.slice/crio-48027299b3469ef60043098b3ea89f02f756b1beda78a5c42f3aeb2600cc2236 WatchSource:0}: Error finding container 48027299b3469ef60043098b3ea89f02f756b1beda78a5c42f3aeb2600cc2236: Status 404 returned error can't find the container with id 48027299b3469ef60043098b3ea89f02f756b1beda78a5c42f3aeb2600cc2236 Sep 30 18:43:07 crc kubenswrapper[4772]: I0930 18:43:07.424911 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdfff439-2c39-4a27-920a-150d447ae757" containerID="efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77" exitCode=0 Sep 30 18:43:07 crc kubenswrapper[4772]: I0930 18:43:07.425028 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerDied","Data":"efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77"} Sep 30 18:43:07 crc kubenswrapper[4772]: I0930 18:43:07.425584 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerStarted","Data":"48027299b3469ef60043098b3ea89f02f756b1beda78a5c42f3aeb2600cc2236"} Sep 30 18:43:08 crc kubenswrapper[4772]: I0930 18:43:08.438321 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdfff439-2c39-4a27-920a-150d447ae757" containerID="d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65" exitCode=0 Sep 30 18:43:08 crc kubenswrapper[4772]: I0930 18:43:08.438431 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerDied","Data":"d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65"} Sep 30 18:43:09 crc kubenswrapper[4772]: I0930 18:43:09.456381 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerStarted","Data":"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016"} Sep 30 18:43:09 crc kubenswrapper[4772]: I0930 18:43:09.498953 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d7p82" podStartSLOduration=3.060331273 podStartE2EDuration="4.498927371s" podCreationTimestamp="2025-09-30 18:43:05 +0000 UTC" firstStartedPulling="2025-09-30 18:43:07.430243608 +0000 UTC m=+6088.337256479" lastFinishedPulling="2025-09-30 18:43:08.868839746 +0000 UTC m=+6089.775852577" observedRunningTime="2025-09-30 18:43:09.486899914 +0000 UTC m=+6090.393912755" watchObservedRunningTime="2025-09-30 18:43:09.498927371 +0000 UTC m=+6090.405940202" Sep 30 18:43:11 crc kubenswrapper[4772]: I0930 18:43:11.899118 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:43:11 crc kubenswrapper[4772]: E0930 18:43:11.902164 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:43:16 crc kubenswrapper[4772]: I0930 18:43:16.223174 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:16 crc kubenswrapper[4772]: I0930 18:43:16.223588 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:16 crc kubenswrapper[4772]: I0930 18:43:16.303862 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:16 crc kubenswrapper[4772]: I0930 18:43:16.587312 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:16 crc kubenswrapper[4772]: I0930 18:43:16.657970 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:18 crc kubenswrapper[4772]: I0930 18:43:18.568622 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d7p82" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="registry-server" containerID="cri-o://d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016" gracePeriod=2 Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.091693 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.199284 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfq5n\" (UniqueName: \"kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n\") pod \"bdfff439-2c39-4a27-920a-150d447ae757\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.199986 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content\") pod \"bdfff439-2c39-4a27-920a-150d447ae757\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.200142 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities\") pod \"bdfff439-2c39-4a27-920a-150d447ae757\" (UID: \"bdfff439-2c39-4a27-920a-150d447ae757\") " Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.200976 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities" (OuterVolumeSpecName: "utilities") pod "bdfff439-2c39-4a27-920a-150d447ae757" (UID: "bdfff439-2c39-4a27-920a-150d447ae757"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.207621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n" (OuterVolumeSpecName: "kube-api-access-hfq5n") pod "bdfff439-2c39-4a27-920a-150d447ae757" (UID: "bdfff439-2c39-4a27-920a-150d447ae757"). InnerVolumeSpecName "kube-api-access-hfq5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.215250 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdfff439-2c39-4a27-920a-150d447ae757" (UID: "bdfff439-2c39-4a27-920a-150d447ae757"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.303183 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.303231 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfff439-2c39-4a27-920a-150d447ae757-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.303247 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfq5n\" (UniqueName: \"kubernetes.io/projected/bdfff439-2c39-4a27-920a-150d447ae757-kube-api-access-hfq5n\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.581564 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdfff439-2c39-4a27-920a-150d447ae757" containerID="d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016" exitCode=0 Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.581627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerDied","Data":"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016"} Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.581660 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7p82" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.581678 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7p82" event={"ID":"bdfff439-2c39-4a27-920a-150d447ae757","Type":"ContainerDied","Data":"48027299b3469ef60043098b3ea89f02f756b1beda78a5c42f3aeb2600cc2236"} Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.581711 4772 scope.go:117] "RemoveContainer" containerID="d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.609990 4772 scope.go:117] "RemoveContainer" containerID="d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.631598 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.642794 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7p82"] Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.643208 4772 scope.go:117] "RemoveContainer" containerID="efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.694685 4772 scope.go:117] "RemoveContainer" containerID="d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016" Sep 30 18:43:19 crc kubenswrapper[4772]: E0930 18:43:19.695108 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016\": container with ID starting with d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016 not found: ID does not exist" containerID="d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.695148 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016"} err="failed to get container status \"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016\": rpc error: code = NotFound desc = could not find container \"d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016\": container with ID starting with d2fbf07be302bced605f319bf16f34f547142918f8052d546c28cd028bc68016 not found: ID does not exist" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.695180 4772 scope.go:117] "RemoveContainer" containerID="d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65" Sep 30 18:43:19 crc kubenswrapper[4772]: E0930 18:43:19.695954 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65\": container with ID starting with d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65 not found: ID does not exist" containerID="d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.695985 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65"} err="failed to get container status \"d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65\": rpc error: code = NotFound desc = could not find 
container \"d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65\": container with ID starting with d42bb14b2d79c25eb339720dcfb387509f029200e93198bd9eca1138bdb8bd65 not found: ID does not exist" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.696003 4772 scope.go:117] "RemoveContainer" containerID="efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77" Sep 30 18:43:19 crc kubenswrapper[4772]: E0930 18:43:19.696335 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77\": container with ID starting with efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77 not found: ID does not exist" containerID="efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.696367 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77"} err="failed to get container status \"efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77\": rpc error: code = NotFound desc = could not find container \"efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77\": container with ID starting with efcfcd07df0de7e3f48e5ccf709853558679e787c33f174bff580ff19d3bbc77 not found: ID does not exist" Sep 30 18:43:19 crc kubenswrapper[4772]: I0930 18:43:19.918793 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfff439-2c39-4a27-920a-150d447ae757" path="/var/lib/kubelet/pods/bdfff439-2c39-4a27-920a-150d447ae757/volumes" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.971860 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:21 crc kubenswrapper[4772]: E0930 18:43:21.972619 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="extract-utilities" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.972633 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="extract-utilities" Sep 30 18:43:21 crc kubenswrapper[4772]: E0930 18:43:21.972646 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="extract-content" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.972652 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="extract-content" Sep 30 18:43:21 crc kubenswrapper[4772]: E0930 18:43:21.972690 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="registry-server" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.972696 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="registry-server" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.972885 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfff439-2c39-4a27-920a-150d447ae757" containerName="registry-server" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.974472 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:21 crc kubenswrapper[4772]: I0930 18:43:21.997326 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.098101 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.098712 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgnrd\" (UniqueName: \"kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.098964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.201704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgnrd\" (UniqueName: \"kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.201768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.201836 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.202461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.202559 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.226243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hgnrd\" (UniqueName: \"kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd\") pod \"redhat-operators-9vspj\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.295626 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:22 crc kubenswrapper[4772]: I0930 18:43:22.822536 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:23 crc kubenswrapper[4772]: I0930 18:43:23.630124 4772 generic.go:334] "Generic (PLEG): container finished" podID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerID="34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82" exitCode=0 Sep 30 18:43:23 crc kubenswrapper[4772]: I0930 18:43:23.630268 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerDied","Data":"34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82"} Sep 30 18:43:23 crc kubenswrapper[4772]: I0930 18:43:23.630499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerStarted","Data":"4096349d409dce622d89086f64b5323bddd6cf13a65d34fe7e7e7bb7dddb1baa"} Sep 30 18:43:25 crc kubenswrapper[4772]: I0930 18:43:25.657727 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerStarted","Data":"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c"} Sep 30 18:43:25 crc kubenswrapper[4772]: I0930 18:43:25.899916 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:43:25 crc kubenswrapper[4772]: E0930 18:43:25.900273 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:43:26 crc kubenswrapper[4772]: I0930 18:43:26.674987 4772 generic.go:334] "Generic (PLEG): container finished" podID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerID="bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c" exitCode=0 Sep 30 18:43:26 crc kubenswrapper[4772]: I0930 18:43:26.675127 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerDied","Data":"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c"} Sep 30 18:43:28 crc kubenswrapper[4772]: I0930 18:43:28.704096 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerStarted","Data":"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6"} Sep 30 18:43:28 crc kubenswrapper[4772]: I0930 18:43:28.734097 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-9vspj" podStartSLOduration=4.147537692 podStartE2EDuration="7.734080988s" podCreationTimestamp="2025-09-30 18:43:21 +0000 UTC" firstStartedPulling="2025-09-30 18:43:23.632422233 +0000 UTC m=+6104.539435064" lastFinishedPulling="2025-09-30 18:43:27.218965529 +0000 UTC m=+6108.125978360" observedRunningTime="2025-09-30 18:43:28.732639091 +0000 UTC m=+6109.639651922" watchObservedRunningTime="2025-09-30 18:43:28.734080988 +0000 UTC m=+6109.641093819" Sep 30 18:43:32 crc kubenswrapper[4772]: I0930 18:43:32.296317 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:32 crc kubenswrapper[4772]: I0930 18:43:32.296989 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:33 crc kubenswrapper[4772]: I0930 18:43:33.385025 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9vspj" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="registry-server" probeResult="failure" output=< Sep 30 18:43:33 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 18:43:33 crc kubenswrapper[4772]: > Sep 30 18:43:36 crc kubenswrapper[4772]: I0930 18:43:36.899449 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:43:36 crc kubenswrapper[4772]: E0930 18:43:36.900361 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:43:42 crc kubenswrapper[4772]: I0930 18:43:42.391129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:42 crc kubenswrapper[4772]: I0930 18:43:42.455949 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:42 crc kubenswrapper[4772]: I0930 18:43:42.642325 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:43 crc kubenswrapper[4772]: I0930 18:43:43.868580 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9vspj" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="registry-server" containerID="cri-o://eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6" gracePeriod=2 Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.413788 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.484641 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content\") pod \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.485046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities\") pod \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.485173 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgnrd\" (UniqueName: \"kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd\") pod \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\" (UID: \"03273a55-7f29-4a1e-a98e-ac3e9a23088a\") " Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.486046 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities" (OuterVolumeSpecName: "utilities") pod "03273a55-7f29-4a1e-a98e-ac3e9a23088a" (UID: "03273a55-7f29-4a1e-a98e-ac3e9a23088a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.493767 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd" (OuterVolumeSpecName: "kube-api-access-hgnrd") pod "03273a55-7f29-4a1e-a98e-ac3e9a23088a" (UID: "03273a55-7f29-4a1e-a98e-ac3e9a23088a"). InnerVolumeSpecName "kube-api-access-hgnrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.584252 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03273a55-7f29-4a1e-a98e-ac3e9a23088a" (UID: "03273a55-7f29-4a1e-a98e-ac3e9a23088a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.588671 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.588712 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgnrd\" (UniqueName: \"kubernetes.io/projected/03273a55-7f29-4a1e-a98e-ac3e9a23088a-kube-api-access-hgnrd\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.588725 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03273a55-7f29-4a1e-a98e-ac3e9a23088a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.884772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerDied","Data":"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6"} Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.884850 4772 scope.go:117] "RemoveContainer" containerID="eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.884792 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vspj" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.884712 4772 generic.go:334] "Generic (PLEG): container finished" podID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerID="eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6" exitCode=0 Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.886367 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vspj" event={"ID":"03273a55-7f29-4a1e-a98e-ac3e9a23088a","Type":"ContainerDied","Data":"4096349d409dce622d89086f64b5323bddd6cf13a65d34fe7e7e7bb7dddb1baa"} Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.929723 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.934975 4772 scope.go:117] "RemoveContainer" containerID="bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c" Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.946231 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9vspj"] Sep 30 18:43:44 crc kubenswrapper[4772]: I0930 18:43:44.977968 4772 scope.go:117] "RemoveContainer" containerID="34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.014843 4772 scope.go:117] "RemoveContainer" containerID="eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6" Sep 30 18:43:45 crc kubenswrapper[4772]: E0930 18:43:45.020625 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6\": container with ID starting with eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6 not found: ID does not exist" containerID="eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.020684 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6"} err="failed to get container status \"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6\": rpc error: code = NotFound desc = could not find container \"eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6\": container with ID starting with eb6843e08e7486d0b9f06695aa91e642ffef9d849ae6d522e80f0d91d5cb25c6 not found: ID does not exist" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.020716 4772 scope.go:117] "RemoveContainer" containerID="bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c" Sep 30 18:43:45 crc kubenswrapper[4772]: E0930 18:43:45.021470 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c\": container with ID starting with bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c not found: ID does not exist" containerID="bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.021533 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c"} err="failed to get container status \"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c\": rpc error: code = NotFound desc = could not find container \"bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c\": container with ID starting with bffb21e2138ccd78e4b866b274f38cca39549e87b126b8c24bb7fcde5149547c not found: ID does not exist" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.021576 4772 scope.go:117] "RemoveContainer" containerID="34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82" Sep 30 18:43:45 crc kubenswrapper[4772]: E0930 18:43:45.022162 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82\": container with ID starting with 34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82 not found: ID does not exist" containerID="34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.022439 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82"} err="failed to get container status \"34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82\": rpc error: code = NotFound desc = could not find container \"34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82\": container with ID starting with 34b842b9f1b722f64c49bc961b058efb5c8d12ddd0e4d6ab0540d4aefac22c82 not found: ID does not exist" Sep 30 18:43:45 crc kubenswrapper[4772]: I0930 18:43:45.915587 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" path="/var/lib/kubelet/pods/03273a55-7f29-4a1e-a98e-ac3e9a23088a/volumes" Sep 30 18:43:47 crc kubenswrapper[4772]: I0930 18:43:47.898283 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:43:47 crc kubenswrapper[4772]: E0930 18:43:47.899129 4772 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:44:02 crc kubenswrapper[4772]: I0930 18:44:02.898561 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:44:02 crc kubenswrapper[4772]: E0930 18:44:02.899689 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:44:16 crc kubenswrapper[4772]: I0930 18:44:16.899021 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:44:16 crc kubenswrapper[4772]: E0930 18:44:16.900264 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:44:29 crc kubenswrapper[4772]: I0930 18:44:29.911219 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:44:29 crc kubenswrapper[4772]: E0930 18:44:29.912344 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:44:41 crc kubenswrapper[4772]: I0930 18:44:41.900305 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:44:42 crc kubenswrapper[4772]: I0930 18:44:42.548649 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06"} Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.177290 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778"] Sep 30 18:45:00 crc kubenswrapper[4772]: E0930 18:45:00.178430 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="extract-utilities" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.178450 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="extract-utilities" Sep 30 18:45:00 crc kubenswrapper[4772]: E0930 
18:45:00.178484 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.178494 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4772]: E0930 18:45:00.178511 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="extract-content" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.178519 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="extract-content" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.178837 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="03273a55-7f29-4a1e-a98e-ac3e9a23088a" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.180950 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.183621 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.184332 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.187789 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778"] Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.298391 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.298508 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.298533 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzs9z\" (UniqueName: \"kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.400648 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 
18:45:00.400696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzs9z\" (UniqueName: \"kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.400811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.402225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.416249 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.425697 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzs9z\" (UniqueName: \"kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z\") pod \"collect-profiles-29320965-dv778\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:00 crc kubenswrapper[4772]: I0930 18:45:00.510440 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:01 crc kubenswrapper[4772]: I0930 18:45:01.003425 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778"] Sep 30 18:45:01 crc kubenswrapper[4772]: I0930 18:45:01.807716 4772 generic.go:334] "Generic (PLEG): container finished" podID="778ce6c9-6029-4c29-932e-80dcbe287ae9" containerID="59dcaee97ac2f1aa569ed57a5a8c1ee291a0b4c88873b91ba8276d1a4e3d622e" exitCode=0 Sep 30 18:45:01 crc kubenswrapper[4772]: I0930 18:45:01.807810 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" event={"ID":"778ce6c9-6029-4c29-932e-80dcbe287ae9","Type":"ContainerDied","Data":"59dcaee97ac2f1aa569ed57a5a8c1ee291a0b4c88873b91ba8276d1a4e3d622e"} Sep 30 18:45:01 crc kubenswrapper[4772]: I0930 18:45:01.808185 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" event={"ID":"778ce6c9-6029-4c29-932e-80dcbe287ae9","Type":"ContainerStarted","Data":"09c10b610b7ddf3490d8714a02a02ede31864f177df53cf87a7274c683d2139a"} Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.225194 4772 util.go:48] "No ready sandbox for pod can be found. 
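
[Editor's note] The recurring `back-off 5m0s restarting failed container` errors for machine-config-daemon-rkhll above are the kubelet's per-container restart backoff: each failed restart doubles the wait, capped at five minutes, while the sync loop keeps re-queuing the pod and logging the same error until the timer expires (the container finally restarts at 18:44:42 above). A sketch of that schedule, assuming the upstream kubelet defaults of a 10s base doubling to a 5m cap:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Kubelet-style container restart backoff: double from the base on
        // every failed restart, capped at a maximum (5m here, matching the
        // "back-off 5m0s" messages in the log).
        base, cap := 10*time.Second, 5*time.Minute
        delay := base
        for restart := 1; restart <= 7; restart++ {
            fmt.Printf("restart %d: wait %v\n", restart, delay)
            delay *= 2
            if delay > cap {
                delay = cap
            }
        }
    }

Once the container stays healthy for long enough, the kubelet discards the backoff entry and the next restart is prompt again.
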
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.273565 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume\") pod \"778ce6c9-6029-4c29-932e-80dcbe287ae9\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.273713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume\") pod \"778ce6c9-6029-4c29-932e-80dcbe287ae9\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.273955 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzs9z\" (UniqueName: \"kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z\") pod \"778ce6c9-6029-4c29-932e-80dcbe287ae9\" (UID: \"778ce6c9-6029-4c29-932e-80dcbe287ae9\") " Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.274548 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume" (OuterVolumeSpecName: "config-volume") pod "778ce6c9-6029-4c29-932e-80dcbe287ae9" (UID: "778ce6c9-6029-4c29-932e-80dcbe287ae9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.274905 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/778ce6c9-6029-4c29-932e-80dcbe287ae9-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.286474 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "778ce6c9-6029-4c29-932e-80dcbe287ae9" (UID: "778ce6c9-6029-4c29-932e-80dcbe287ae9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.291125 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z" (OuterVolumeSpecName: "kube-api-access-jzs9z") pod "778ce6c9-6029-4c29-932e-80dcbe287ae9" (UID: "778ce6c9-6029-4c29-932e-80dcbe287ae9"). InnerVolumeSpecName "kube-api-access-jzs9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.377848 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/778ce6c9-6029-4c29-932e-80dcbe287ae9-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.377878 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzs9z\" (UniqueName: \"kubernetes.io/projected/778ce6c9-6029-4c29-932e-80dcbe287ae9-kube-api-access-jzs9z\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.831958 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" event={"ID":"778ce6c9-6029-4c29-932e-80dcbe287ae9","Type":"ContainerDied","Data":"09c10b610b7ddf3490d8714a02a02ede31864f177df53cf87a7274c683d2139a"} Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.832020 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c10b610b7ddf3490d8714a02a02ede31864f177df53cf87a7274c683d2139a" Sep 30 18:45:03 crc kubenswrapper[4772]: I0930 18:45:03.832096 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-dv778" Sep 30 18:45:04 crc kubenswrapper[4772]: I0930 18:45:04.317294 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft"] Sep 30 18:45:04 crc kubenswrapper[4772]: I0930 18:45:04.328682 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-b4fft"] Sep 30 18:45:05 crc kubenswrapper[4772]: I0930 18:45:05.913695 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c10a89ba-f623-46aa-88c6-a37a9bbf0052" path="/var/lib/kubelet/pods/c10a89ba-f623-46aa-88c6-a37a9bbf0052/volumes" Sep 30 18:45:47 crc kubenswrapper[4772]: I0930 18:45:47.563360 4772 scope.go:117] "RemoveContainer" containerID="e63e98edcdaeccbb469b4122dc158f45068de02fc54c6595b5a2f81b52b34543" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.187317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:40 crc kubenswrapper[4772]: E0930 18:46:40.188907 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778ce6c9-6029-4c29-932e-80dcbe287ae9" containerName="collect-profiles" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.188928 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="778ce6c9-6029-4c29-932e-80dcbe287ae9" containerName="collect-profiles" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.189124 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="778ce6c9-6029-4c29-932e-80dcbe287ae9" containerName="collect-profiles" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.190664 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.219090 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.255850 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.255934 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7ls\" (UniqueName: \"kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.256028 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.358014 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.359095 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7ls\" (UniqueName: \"kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.359297 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.359870 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.358646 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.389290 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qf7ls\" (UniqueName: \"kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls\") pod \"community-operators-vbmbw\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.524163 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:40 crc kubenswrapper[4772]: I0930 18:46:40.963124 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:41 crc kubenswrapper[4772]: I0930 18:46:41.018010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerStarted","Data":"1e1b341a503265a8b2aae2a057f565384787c9a4ffcfebb582ed75c2649779d4"} Sep 30 18:46:42 crc kubenswrapper[4772]: I0930 18:46:42.032290 4772 generic.go:334] "Generic (PLEG): container finished" podID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerID="d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc" exitCode=0 Sep 30 18:46:42 crc kubenswrapper[4772]: I0930 18:46:42.032376 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerDied","Data":"d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc"} Sep 30 18:46:42 crc kubenswrapper[4772]: I0930 18:46:42.064091 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:46:44 crc kubenswrapper[4772]: I0930 18:46:44.063421 4772 generic.go:334] "Generic (PLEG): container finished" podID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerID="4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8" exitCode=0 Sep 30 18:46:44 crc kubenswrapper[4772]: I0930 18:46:44.063486 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerDied","Data":"4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8"} Sep 30 18:46:45 crc kubenswrapper[4772]: I0930 18:46:45.088967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerStarted","Data":"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84"} Sep 30 18:46:45 crc kubenswrapper[4772]: I0930 18:46:45.125439 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbmbw" podStartSLOduration=2.576754644 podStartE2EDuration="5.125406716s" podCreationTimestamp="2025-09-30 18:46:40 +0000 UTC" firstStartedPulling="2025-09-30 18:46:42.063634921 +0000 UTC m=+6302.970647742" lastFinishedPulling="2025-09-30 18:46:44.612286983 +0000 UTC m=+6305.519299814" observedRunningTime="2025-09-30 18:46:45.113728048 +0000 UTC m=+6306.020740889" watchObservedRunningTime="2025-09-30 18:46:45.125406716 +0000 UTC m=+6306.032419567" Sep 30 18:46:50 crc kubenswrapper[4772]: I0930 18:46:50.524512 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:50 crc kubenswrapper[4772]: I0930 18:46:50.529130 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:50 crc kubenswrapper[4772]: I0930 18:46:50.616323 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:51 crc kubenswrapper[4772]: I0930 18:46:51.248861 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:51 crc kubenswrapper[4772]: I0930 18:46:51.313746 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.198103 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbmbw" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="registry-server" containerID="cri-o://73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84" gracePeriod=2 Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.700767 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.862893 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities\") pod \"630f1a25-f9d3-4d89-8836-f19a24c396f0\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.863417 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content\") pod \"630f1a25-f9d3-4d89-8836-f19a24c396f0\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.863797 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7ls\" (UniqueName: \"kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls\") pod \"630f1a25-f9d3-4d89-8836-f19a24c396f0\" (UID: \"630f1a25-f9d3-4d89-8836-f19a24c396f0\") " Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.864588 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities" (OuterVolumeSpecName: "utilities") pod "630f1a25-f9d3-4d89-8836-f19a24c396f0" (UID: "630f1a25-f9d3-4d89-8836-f19a24c396f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.871699 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls" (OuterVolumeSpecName: "kube-api-access-qf7ls") pod "630f1a25-f9d3-4d89-8836-f19a24c396f0" (UID: "630f1a25-f9d3-4d89-8836-f19a24c396f0"). InnerVolumeSpecName "kube-api-access-qf7ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.915443 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "630f1a25-f9d3-4d89-8836-f19a24c396f0" (UID: "630f1a25-f9d3-4d89-8836-f19a24c396f0"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.967758 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7ls\" (UniqueName: \"kubernetes.io/projected/630f1a25-f9d3-4d89-8836-f19a24c396f0-kube-api-access-qf7ls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.967834 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:53 crc kubenswrapper[4772]: I0930 18:46:53.967854 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f1a25-f9d3-4d89-8836-f19a24c396f0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.219701 4772 generic.go:334] "Generic (PLEG): container finished" podID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerID="73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84" exitCode=0 Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.219767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerDied","Data":"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84"} Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.219809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbmbw" event={"ID":"630f1a25-f9d3-4d89-8836-f19a24c396f0","Type":"ContainerDied","Data":"1e1b341a503265a8b2aae2a057f565384787c9a4ffcfebb582ed75c2649779d4"} Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.219836 4772 scope.go:117] "RemoveContainer" containerID="73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.220033 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbmbw" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.282945 4772 scope.go:117] "RemoveContainer" containerID="4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.293110 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.381763 4772 scope.go:117] "RemoveContainer" containerID="d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.388751 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbmbw"] Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.453435 4772 scope.go:117] "RemoveContainer" containerID="73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84" Sep 30 18:46:54 crc kubenswrapper[4772]: E0930 18:46:54.465756 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84\": container with ID starting with 73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84 not found: ID does not exist" containerID="73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.465885 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84"} err="failed to get container status \"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84\": rpc error: code = NotFound desc = could not find container \"73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84\": container with ID starting with 73256f3c3ace47009a0637cf8bbb2d163d4fd0c26443c4ac665c1b0a75ef4a84 not found: ID does not exist" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.465958 4772 scope.go:117] "RemoveContainer" containerID="4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8" Sep 30 18:46:54 crc kubenswrapper[4772]: E0930 18:46:54.473670 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8\": container with ID starting with 4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8 not found: ID does not exist" containerID="4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.473780 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8"} err="failed to get container status \"4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8\": rpc error: code = NotFound desc = could not find container \"4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8\": container with ID starting with 4a2bfd6c24751da544cf6d02724145062ba67c8a8bcf5b6cd69f830b1a9ab7c8 not found: ID does not exist" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.473827 4772 scope.go:117] "RemoveContainer" containerID="d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc" Sep 30 18:46:54 crc kubenswrapper[4772]: E0930 18:46:54.475166 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc\": container with ID starting with d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc not found: ID does not exist" containerID="d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc" Sep 30 18:46:54 crc kubenswrapper[4772]: I0930 18:46:54.475202 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc"} err="failed to get container status \"d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc\": rpc error: code = NotFound desc = could not find container \"d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc\": container with ID starting with d41309a91d57f4634b9de2e6eb247ad771e5f509ee0cd9f835400bc739b3b6dc not found: ID does not exist" Sep 30 18:46:55 crc kubenswrapper[4772]: I0930 18:46:55.913850 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" path="/var/lib/kubelet/pods/630f1a25-f9d3-4d89-8836-f19a24c396f0/volumes" Sep 30 18:47:08 crc kubenswrapper[4772]: I0930 18:47:08.655851 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:47:08 crc kubenswrapper[4772]: I0930 18:47:08.657197 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:47:38 crc kubenswrapper[4772]: I0930 18:47:38.655914 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:47:38 crc kubenswrapper[4772]: I0930 18:47:38.656996 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:48:08 crc kubenswrapper[4772]: I0930 18:48:08.656156 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:48:08 crc kubenswrapper[4772]: I0930 18:48:08.657228 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:48:08 crc kubenswrapper[4772]: I0930 18:48:08.657291 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:48:08 crc kubenswrapper[4772]: I0930 18:48:08.658457 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:48:08 crc kubenswrapper[4772]: I0930 18:48:08.658555 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06" gracePeriod=600 Sep 30 18:48:09 crc kubenswrapper[4772]: I0930 18:48:09.156910 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06" exitCode=0 Sep 30 18:48:09 crc kubenswrapper[4772]: I0930 18:48:09.157429 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06"} Sep 30 18:48:09 crc kubenswrapper[4772]: I0930 18:48:09.157473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d"} Sep 30 18:48:09 crc kubenswrapper[4772]: I0930 18:48:09.157506 4772 scope.go:117] "RemoveContainer" containerID="338d87418168fd6eb7884bd305f23a485e53c465f890cde709de1ed83bebeb1f" Sep 30 18:50:38 crc kubenswrapper[4772]: I0930 18:50:38.655796 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:50:38 crc kubenswrapper[4772]: I0930 18:50:38.656505 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:51:08 crc kubenswrapper[4772]: I0930 18:51:08.656175 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:51:08 crc kubenswrapper[4772]: I0930 18:51:08.656874 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:51:38 crc 
kubenswrapper[4772]: I0930 18:51:38.655369 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:51:38 crc kubenswrapper[4772]: I0930 18:51:38.656391 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:51:38 crc kubenswrapper[4772]: I0930 18:51:38.656464 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 18:51:38 crc kubenswrapper[4772]: I0930 18:51:38.658743 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:51:38 crc kubenswrapper[4772]: I0930 18:51:38.658865 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" gracePeriod=600 Sep 30 18:51:38 crc kubenswrapper[4772]: E0930 18:51:38.788938 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:51:39 crc kubenswrapper[4772]: I0930 18:51:39.709385 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" exitCode=0 Sep 30 18:51:39 crc kubenswrapper[4772]: I0930 18:51:39.709455 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d"} Sep 30 18:51:39 crc kubenswrapper[4772]: I0930 18:51:39.709794 4772 scope.go:117] "RemoveContainer" containerID="1367294882a2d9c33499e14d2032a9440594dbec5994447bf3fc56ceb9196d06" Sep 30 18:51:39 crc kubenswrapper[4772]: I0930 18:51:39.710633 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:51:39 crc kubenswrapper[4772]: E0930 18:51:39.710898 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:51:49 crc kubenswrapper[4772]: I0930 18:51:49.913437 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:51:49 crc kubenswrapper[4772]: E0930 18:51:49.914590 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.077143 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4d25m"] Sep 30 18:51:55 crc kubenswrapper[4772]: E0930 18:51:55.080142 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="extract-utilities" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.080246 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="extract-utilities" Sep 30 18:51:55 crc kubenswrapper[4772]: E0930 18:51:55.080651 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="extract-content" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.080723 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="extract-content" Sep 30 18:51:55 crc kubenswrapper[4772]: E0930 18:51:55.080797 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="registry-server" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.080854 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="registry-server" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.081136 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="630f1a25-f9d3-4d89-8836-f19a24c396f0" containerName="registry-server" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.082818 4772 util.go:30] "No sandbox for pod can be found. 
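
[Editor's note] The machine-config-daemon liveness failures earlier (`Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused`) are plain HTTP probes: the kubelet GETs the health endpoint and counts connection refused or a bad status as a failure, and after enough consecutive failures kills the container ("failed liveness probe, will be restarted"). A sketch of the client side of such a check, assuming net/http defaults and the Kubernetes convention that any status in [200, 400) is success (not the kubelet's exact prober code):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get("http://127.0.0.1:8798/health")
        if err != nil {
            // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused",
            // the exact failure output in the log above.
            fmt.Println("probe failure:", err)
            return
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            fmt.Println("probe failure: status", resp.Status)
            return
        }
        fmt.Println("probe success:", resp.Status)
    }
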
Need to start a new one" pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.103151 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4d25m"] Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.238893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.239127 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.239477 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ns2\" (UniqueName: \"kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.342717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.342805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.342862 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ns2\" (UniqueName: \"kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.343307 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.343370 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.377273 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-24ns2\" (UniqueName: \"kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2\") pod \"certified-operators-4d25m\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") " pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:55 crc kubenswrapper[4772]: I0930 18:51:55.415729 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d25m" Sep 30 18:51:56 crc kubenswrapper[4772]: I0930 18:51:56.156030 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4d25m"] Sep 30 18:51:56 crc kubenswrapper[4772]: I0930 18:51:56.922015 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8" exitCode=0 Sep 30 18:51:56 crc kubenswrapper[4772]: I0930 18:51:56.922520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerDied","Data":"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"} Sep 30 18:51:56 crc kubenswrapper[4772]: I0930 18:51:56.922552 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerStarted","Data":"91b0d46dea7e0916bcaa66a837b340c88726958a93b3d3a235d9b4a497f9479b"} Sep 30 18:51:56 crc kubenswrapper[4772]: I0930 18:51:56.927081 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:51:57 crc kubenswrapper[4772]: I0930 18:51:57.984870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerStarted","Data":"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"} Sep 30 18:51:58 crc kubenswrapper[4772]: I0930 18:51:58.999206 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerID="e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248" exitCode=0 Sep 30 18:51:58 crc kubenswrapper[4772]: I0930 18:51:58.999311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerDied","Data":"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"} Sep 30 18:52:00 crc kubenswrapper[4772]: I0930 18:52:00.012019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerStarted","Data":"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"} Sep 30 18:52:00 crc kubenswrapper[4772]: I0930 18:52:00.036225 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4d25m" podStartSLOduration=2.484623729 podStartE2EDuration="5.036208835s" podCreationTimestamp="2025-09-30 18:51:55 +0000 UTC" firstStartedPulling="2025-09-30 18:51:56.924804603 +0000 UTC m=+6617.831817434" lastFinishedPulling="2025-09-30 18:51:59.476389699 +0000 UTC m=+6620.383402540" observedRunningTime="2025-09-30 18:52:00.036075331 +0000 UTC m=+6620.943088162" watchObservedRunningTime="2025-09-30 
Sep 30 18:52:02 crc kubenswrapper[4772]: I0930 18:52:02.898655 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d"
Sep 30 18:52:02 crc kubenswrapper[4772]: E0930 18:52:02.900023 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 18:52:05 crc kubenswrapper[4772]: I0930 18:52:05.416496 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:05 crc kubenswrapper[4772]: I0930 18:52:05.417002 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:05 crc kubenswrapper[4772]: I0930 18:52:05.474837 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:06 crc kubenswrapper[4772]: I0930 18:52:06.141442 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:06 crc kubenswrapper[4772]: I0930 18:52:06.199955 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4d25m"]
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.108747 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4d25m" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="registry-server" containerID="cri-o://41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2" gracePeriod=2
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.651603 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.657435 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content\") pod \"65e099b1-cfac-4b7a-83cd-ba152f37e810\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") "
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.657716 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ns2\" (UniqueName: \"kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2\") pod \"65e099b1-cfac-4b7a-83cd-ba152f37e810\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") "
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.657839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities\") pod \"65e099b1-cfac-4b7a-83cd-ba152f37e810\" (UID: \"65e099b1-cfac-4b7a-83cd-ba152f37e810\") "
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.659465 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities" (OuterVolumeSpecName: "utilities") pod "65e099b1-cfac-4b7a-83cd-ba152f37e810" (UID: "65e099b1-cfac-4b7a-83cd-ba152f37e810"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.665748 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2" (OuterVolumeSpecName: "kube-api-access-24ns2") pod "65e099b1-cfac-4b7a-83cd-ba152f37e810" (UID: "65e099b1-cfac-4b7a-83cd-ba152f37e810"). InnerVolumeSpecName "kube-api-access-24ns2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.722154 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65e099b1-cfac-4b7a-83cd-ba152f37e810" (UID: "65e099b1-cfac-4b7a-83cd-ba152f37e810"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.761028 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ns2\" (UniqueName: \"kubernetes.io/projected/65e099b1-cfac-4b7a-83cd-ba152f37e810-kube-api-access-24ns2\") on node \"crc\" DevicePath \"\""
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.761073 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 18:52:08 crc kubenswrapper[4772]: I0930 18:52:08.761085 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e099b1-cfac-4b7a-83cd-ba152f37e810-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.134600 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerID="41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2" exitCode=0
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.134676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerDied","Data":"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"}
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.135193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4d25m" event={"ID":"65e099b1-cfac-4b7a-83cd-ba152f37e810","Type":"ContainerDied","Data":"91b0d46dea7e0916bcaa66a837b340c88726958a93b3d3a235d9b4a497f9479b"}
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.134703 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4d25m"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.135240 4772 scope.go:117] "RemoveContainer" containerID="41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.170222 4772 scope.go:117] "RemoveContainer" containerID="e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.195523 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4d25m"]
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.204071 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4d25m"]
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.204702 4772 scope.go:117] "RemoveContainer" containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.258778 4772 scope.go:117] "RemoveContainer" containerID="41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"
Sep 30 18:52:09 crc kubenswrapper[4772]: E0930 18:52:09.259382 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2\": container with ID starting with 41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2 not found: ID does not exist" containerID="41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.259427 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2"} err="failed to get container status \"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2\": rpc error: code = NotFound desc = could not find container \"41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2\": container with ID starting with 41443263ab568b0688437f31d0c321ccfd2289235bb3bea67fc38c5c9394c4e2 not found: ID does not exist"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.259457 4772 scope.go:117] "RemoveContainer" containerID="e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"
Sep 30 18:52:09 crc kubenswrapper[4772]: E0930 18:52:09.259699 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248\": container with ID starting with e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248 not found: ID does not exist" containerID="e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.259721 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248"} err="failed to get container status \"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248\": rpc error: code = NotFound desc = could not find container \"e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248\": container with ID starting with e17d87fc3f428aacc53c0671d40b9edba57229c93a957f3c1bd703a9680a9248 not found: ID does not exist"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.259738 4772 scope.go:117] "RemoveContainer" containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"
Sep 30 18:52:09 crc kubenswrapper[4772]: E0930 18:52:09.260267 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": container with ID starting with 1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8 not found: ID does not exist" containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"
Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.260369 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"} err="failed to get container status \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": rpc error: code = NotFound desc = could not find container \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": container with ID starting with 1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8 not found: ID does not exist"
containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8" Sep 30 18:52:09 crc kubenswrapper[4772]: E0930 18:52:09.260267 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": container with ID starting with 1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8 not found: ID does not exist" containerID="1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8" Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.260369 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8"} err="failed to get container status \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": rpc error: code = NotFound desc = could not find container \"1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8\": container with ID starting with 1bd23d448ec867d2cbbe7d45548ae131bc319b843f4a3ec1acb95d0ef13d52c8 not found: ID does not exist" Sep 30 18:52:09 crc kubenswrapper[4772]: I0930 18:52:09.917278 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" path="/var/lib/kubelet/pods/65e099b1-cfac-4b7a-83cd-ba152f37e810/volumes" Sep 30 18:52:16 crc kubenswrapper[4772]: I0930 18:52:16.898974 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:52:16 crc kubenswrapper[4772]: E0930 18:52:16.901609 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:52:31 crc kubenswrapper[4772]: I0930 18:52:31.898202 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:52:31 crc kubenswrapper[4772]: E0930 18:52:31.899177 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:52:42 crc kubenswrapper[4772]: I0930 18:52:42.899201 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:52:42 crc kubenswrapper[4772]: E0930 18:52:42.900286 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:52:57 crc kubenswrapper[4772]: I0930 18:52:57.899484 4772 scope.go:117] "RemoveContainer" 
containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:52:57 crc kubenswrapper[4772]: E0930 18:52:57.901227 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:53:08 crc kubenswrapper[4772]: I0930 18:53:08.899486 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:53:08 crc kubenswrapper[4772]: E0930 18:53:08.900977 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:53:21 crc kubenswrapper[4772]: I0930 18:53:21.898971 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:53:21 crc kubenswrapper[4772]: E0930 18:53:21.900843 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:53:33 crc kubenswrapper[4772]: I0930 18:53:33.899235 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:53:33 crc kubenswrapper[4772]: E0930 18:53:33.900286 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:53:46 crc kubenswrapper[4772]: I0930 18:53:46.898733 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:53:46 crc kubenswrapper[4772]: E0930 18:53:46.899914 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:53:57 crc kubenswrapper[4772]: I0930 18:53:57.899585 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:53:57 crc kubenswrapper[4772]: E0930 18:53:57.901354 4772 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:54:08 crc kubenswrapper[4772]: I0930 18:54:08.898787 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:54:08 crc kubenswrapper[4772]: E0930 18:54:08.899982 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:54:23 crc kubenswrapper[4772]: I0930 18:54:23.898542 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:54:23 crc kubenswrapper[4772]: E0930 18:54:23.899421 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.327994 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:24 crc kubenswrapper[4772]: E0930 18:54:24.328709 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="extract-content" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.328740 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="extract-content" Sep 30 18:54:24 crc kubenswrapper[4772]: E0930 18:54:24.328776 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="extract-utilities" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.328785 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="extract-utilities" Sep 30 18:54:24 crc kubenswrapper[4772]: E0930 18:54:24.328815 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="registry-server" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.328824 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="registry-server" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.329149 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e099b1-cfac-4b7a-83cd-ba152f37e810" containerName="registry-server" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.331529 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.344578 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.454390 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576pn\" (UniqueName: \"kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.454873 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.454958 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.557721 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.558232 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576pn\" (UniqueName: \"kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.558390 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.558475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.559034 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.584775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-576pn\" (UniqueName: \"kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn\") pod \"redhat-operators-m29xq\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:24 crc kubenswrapper[4772]: I0930 18:54:24.672809 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:25 crc kubenswrapper[4772]: I0930 18:54:25.184780 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:25 crc kubenswrapper[4772]: I0930 18:54:25.875283 4772 generic.go:334] "Generic (PLEG): container finished" podID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerID="46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0" exitCode=0 Sep 30 18:54:25 crc kubenswrapper[4772]: I0930 18:54:25.875372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerDied","Data":"46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0"} Sep 30 18:54:25 crc kubenswrapper[4772]: I0930 18:54:25.875444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerStarted","Data":"b6935022980eaf35d0e71e48041cd8152ab6f47553ce257798aee63c50f7737a"} Sep 30 18:54:27 crc kubenswrapper[4772]: I0930 18:54:27.952284 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerStarted","Data":"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d"} Sep 30 18:54:29 crc kubenswrapper[4772]: I0930 18:54:29.940945 4772 generic.go:334] "Generic (PLEG): container finished" podID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerID="34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d" exitCode=0 Sep 30 18:54:29 crc kubenswrapper[4772]: I0930 18:54:29.941124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerDied","Data":"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d"} Sep 30 18:54:30 crc kubenswrapper[4772]: I0930 18:54:30.953476 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerStarted","Data":"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7"} Sep 30 18:54:30 crc kubenswrapper[4772]: I0930 18:54:30.983037 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m29xq" podStartSLOduration=2.450584724 podStartE2EDuration="6.983008684s" podCreationTimestamp="2025-09-30 18:54:24 +0000 UTC" firstStartedPulling="2025-09-30 18:54:25.87870056 +0000 UTC m=+6766.785713391" lastFinishedPulling="2025-09-30 18:54:30.41112451 +0000 UTC m=+6771.318137351" observedRunningTime="2025-09-30 18:54:30.972926677 +0000 UTC m=+6771.879939558" watchObservedRunningTime="2025-09-30 18:54:30.983008684 +0000 UTC m=+6771.890021525" Sep 30 18:54:34 crc kubenswrapper[4772]: I0930 18:54:34.673587 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m29xq" Sep 
30 18:54:34 crc kubenswrapper[4772]: I0930 18:54:34.674238 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:34 crc kubenswrapper[4772]: I0930 18:54:34.898371 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:54:34 crc kubenswrapper[4772]: E0930 18:54:34.898717 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:54:35 crc kubenswrapper[4772]: I0930 18:54:35.780957 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m29xq" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="registry-server" probeResult="failure" output=< Sep 30 18:54:35 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 18:54:35 crc kubenswrapper[4772]: > Sep 30 18:54:44 crc kubenswrapper[4772]: I0930 18:54:44.730580 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:44 crc kubenswrapper[4772]: I0930 18:54:44.798072 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:44 crc kubenswrapper[4772]: I0930 18:54:44.971130 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:45 crc kubenswrapper[4772]: I0930 18:54:45.139693 4772 generic.go:334] "Generic (PLEG): container finished" podID="10f9355c-b2c3-4893-86db-91551575a21e" containerID="53f755913376d077923ed246a5ea241b2b43eb5d27ad6bc405aefa9115e3534e" exitCode=0 Sep 30 18:54:45 crc kubenswrapper[4772]: I0930 18:54:45.139791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10f9355c-b2c3-4893-86db-91551575a21e","Type":"ContainerDied","Data":"53f755913376d077923ed246a5ea241b2b43eb5d27ad6bc405aefa9115e3534e"} Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.152382 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m29xq" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="registry-server" containerID="cri-o://c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7" gracePeriod=2 Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.598150 4772 util.go:48] "No ready sandbox for pod can be found. 
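
The failed Startup probe above is a gRPC health-protocol check against registry-server on :50051 (the "timeout: failed to connect service" text is the output format gRPC health probes emit), and nine seconds later the same probe reports started. A standalone client doing the equivalent check might look like this, with the port and 1s timeout taken from the log and everything else assumed:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// 1s deadline, mirroring the probe timeout in the log.
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	conn, err := grpc.NewClient("localhost:50051", grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		fmt.Println("probe failure:", err) // e.g. deadline exceeded while the catalog is still loading
		return
	}
	fmt.Println("probe status:", resp.Status) // SERVING once registry-server is ready
}
```
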
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669041 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669271 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669320 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669345 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669383 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669415 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669466 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669505 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.669529 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54cht\" (UniqueName: \"kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht\") pod \"10f9355c-b2c3-4893-86db-91551575a21e\" (UID: \"10f9355c-b2c3-4893-86db-91551575a21e\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.670991 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data" (OuterVolumeSpecName: "config-data") pod 
"10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.671174 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.674961 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.675881 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.676221 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht" (OuterVolumeSpecName: "kube-api-access-54cht") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "kube-api-access-54cht". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.703462 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.709237 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.718165 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.741693 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "10f9355c-b2c3-4893-86db-91551575a21e" (UID: "10f9355c-b2c3-4893-86db-91551575a21e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.748126 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772131 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772208 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772222 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54cht\" (UniqueName: \"kubernetes.io/projected/10f9355c-b2c3-4893-86db-91551575a21e-kube-api-access-54cht\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772233 4772 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772242 4772 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772251 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10f9355c-b2c3-4893-86db-91551575a21e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772259 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10f9355c-b2c3-4893-86db-91551575a21e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772268 4772 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.772279 4772 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10f9355c-b2c3-4893-86db-91551575a21e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.800167 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.873970 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities\") pod \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.874197 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576pn\" (UniqueName: \"kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn\") pod \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.874236 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content\") pod \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\" (UID: \"025dc2d3-ca81-4c43-8089-ca15f8fa8769\") " Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.875016 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.875946 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities" (OuterVolumeSpecName: "utilities") pod "025dc2d3-ca81-4c43-8089-ca15f8fa8769" (UID: "025dc2d3-ca81-4c43-8089-ca15f8fa8769"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.878562 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn" (OuterVolumeSpecName: "kube-api-access-576pn") pod "025dc2d3-ca81-4c43-8089-ca15f8fa8769" (UID: "025dc2d3-ca81-4c43-8089-ca15f8fa8769"). InnerVolumeSpecName "kube-api-access-576pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.966710 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "025dc2d3-ca81-4c43-8089-ca15f8fa8769" (UID: "025dc2d3-ca81-4c43-8089-ca15f8fa8769"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.977338 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.977379 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576pn\" (UniqueName: \"kubernetes.io/projected/025dc2d3-ca81-4c43-8089-ca15f8fa8769-kube-api-access-576pn\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:46 crc kubenswrapper[4772]: I0930 18:54:46.977394 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025dc2d3-ca81-4c43-8089-ca15f8fa8769-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.166124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10f9355c-b2c3-4893-86db-91551575a21e","Type":"ContainerDied","Data":"bc036abe7e0db14bc785ff0de32416a3930e7ab5149c3c53649b01dec24ac112"} Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.166172 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc036abe7e0db14bc785ff0de32416a3930e7ab5149c3c53649b01dec24ac112" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.166183 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.170412 4772 generic.go:334] "Generic (PLEG): container finished" podID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerID="c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7" exitCode=0 Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.170446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerDied","Data":"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7"} Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.170463 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m29xq" event={"ID":"025dc2d3-ca81-4c43-8089-ca15f8fa8769","Type":"ContainerDied","Data":"b6935022980eaf35d0e71e48041cd8152ab6f47553ce257798aee63c50f7737a"} Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.170481 4772 scope.go:117] "RemoveContainer" containerID="c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.170644 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m29xq" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.193626 4772 scope.go:117] "RemoveContainer" containerID="34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.218642 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.227018 4772 scope.go:117] "RemoveContainer" containerID="46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.235849 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m29xq"] Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.260380 4772 scope.go:117] "RemoveContainer" containerID="c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7" Sep 30 18:54:47 crc kubenswrapper[4772]: E0930 18:54:47.261571 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7\": container with ID starting with c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7 not found: ID does not exist" containerID="c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.261987 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7"} err="failed to get container status \"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7\": rpc error: code = NotFound desc = could not find container \"c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7\": container with ID starting with c9789e15503a286cf39b1795a79bd2cf71fd014b8b176d9c1cbc6d063cfa0bc7 not found: ID does not exist" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.262027 4772 scope.go:117] "RemoveContainer" containerID="34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d" Sep 30 18:54:47 crc kubenswrapper[4772]: E0930 18:54:47.263079 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d\": container with ID starting with 34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d not found: ID does not exist" containerID="34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.263131 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d"} err="failed to get container status \"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d\": rpc error: code = NotFound desc = could not find container \"34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d\": container with ID starting with 34968c86dda868f46384747cf5833d4887deea3ee3dc8e35abb278cd58e5e19d not found: ID does not exist" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.267593 4772 scope.go:117] "RemoveContainer" containerID="46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0" Sep 30 18:54:47 crc kubenswrapper[4772]: E0930 18:54:47.268983 4772 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0\": container with ID starting with 46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0 not found: ID does not exist" containerID="46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.269037 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0"} err="failed to get container status \"46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0\": rpc error: code = NotFound desc = could not find container \"46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0\": container with ID starting with 46fabbf8f4b6021177249cebef760dbd42916c7b4bed424348a710bc179fceb0 not found: ID does not exist" Sep 30 18:54:47 crc kubenswrapper[4772]: I0930 18:54:47.917794 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" path="/var/lib/kubelet/pods/025dc2d3-ca81-4c43-8089-ca15f8fa8769/volumes" Sep 30 18:54:48 crc kubenswrapper[4772]: I0930 18:54:48.899220 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:54:48 crc kubenswrapper[4772]: E0930 18:54:48.899817 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.749211 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 18:54:51 crc kubenswrapper[4772]: E0930 18:54:51.750526 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="extract-utilities" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750541 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="extract-utilities" Sep 30 18:54:51 crc kubenswrapper[4772]: E0930 18:54:51.750566 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f9355c-b2c3-4893-86db-91551575a21e" containerName="tempest-tests-tempest-tests-runner" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750575 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f9355c-b2c3-4893-86db-91551575a21e" containerName="tempest-tests-tempest-tests-runner" Sep 30 18:54:51 crc kubenswrapper[4772]: E0930 18:54:51.750609 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="registry-server" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750616 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="registry-server" Sep 30 18:54:51 crc kubenswrapper[4772]: E0930 18:54:51.750630 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="extract-content" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750635 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="extract-content" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750848 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f9355c-b2c3-4893-86db-91551575a21e" containerName="tempest-tests-tempest-tests-runner" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.750874 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="025dc2d3-ca81-4c43-8089-ca15f8fa8769" containerName="registry-server" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.751806 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.754819 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-t7k22" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.761803 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.787402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dzqj\" (UniqueName: \"kubernetes.io/projected/8efe2257-d089-4e71-b8cf-e80ca250b5d4-kube-api-access-8dzqj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.787696 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.891387 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dzqj\" (UniqueName: \"kubernetes.io/projected/8efe2257-d089-4e71-b8cf-e80ca250b5d4-kube-api-access-8dzqj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.892639 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.894277 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.922134 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dzqj\" (UniqueName: 
\"kubernetes.io/projected/8efe2257-d089-4e71-b8cf-e80ca250b5d4-kube-api-access-8dzqj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:51 crc kubenswrapper[4772]: I0930 18:54:51.942137 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8efe2257-d089-4e71-b8cf-e80ca250b5d4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:52 crc kubenswrapper[4772]: I0930 18:54:52.080326 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 18:54:52 crc kubenswrapper[4772]: I0930 18:54:52.610543 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 18:54:53 crc kubenswrapper[4772]: I0930 18:54:53.237001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8efe2257-d089-4e71-b8cf-e80ca250b5d4","Type":"ContainerStarted","Data":"4c3b9955f8fe164b11e65049d9b394d01b60890b40be8c41bbf3d903a1ef724d"} Sep 30 18:54:54 crc kubenswrapper[4772]: I0930 18:54:54.252177 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8efe2257-d089-4e71-b8cf-e80ca250b5d4","Type":"ContainerStarted","Data":"e0784136c3a23b457adbab1d091e1041f6ed34b8db9ff526221a68291b9ac663"} Sep 30 18:54:54 crc kubenswrapper[4772]: I0930 18:54:54.270748 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.298819528 podStartE2EDuration="3.270715446s" podCreationTimestamp="2025-09-30 18:54:51 +0000 UTC" firstStartedPulling="2025-09-30 18:54:52.621094381 +0000 UTC m=+6793.528107212" lastFinishedPulling="2025-09-30 18:54:53.592990289 +0000 UTC m=+6794.500003130" observedRunningTime="2025-09-30 18:54:54.264901717 +0000 UTC m=+6795.171914548" watchObservedRunningTime="2025-09-30 18:54:54.270715446 +0000 UTC m=+6795.177728307" Sep 30 18:55:01 crc kubenswrapper[4772]: I0930 18:55:01.898779 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:55:01 crc kubenswrapper[4772]: E0930 18:55:01.899850 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.779236 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cmjnr/must-gather-tcnzd"] Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.781811 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.785136 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cmjnr"/"default-dockercfg-z2hg2" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.785392 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cmjnr"/"kube-root-ca.crt" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.785530 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cmjnr"/"openshift-service-ca.crt" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.793239 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cmjnr/must-gather-tcnzd"] Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.873583 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcq95\" (UniqueName: \"kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.874006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.976597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcq95\" (UniqueName: \"kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.976805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:12 crc kubenswrapper[4772]: I0930 18:55:12.978118 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:13 crc kubenswrapper[4772]: I0930 18:55:13.008168 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcq95\" (UniqueName: \"kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95\") pod \"must-gather-tcnzd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:13 crc kubenswrapper[4772]: I0930 18:55:13.111293 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 18:55:13 crc kubenswrapper[4772]: I0930 18:55:13.657022 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cmjnr/must-gather-tcnzd"] Sep 30 18:55:14 crc kubenswrapper[4772]: I0930 18:55:14.502269 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" event={"ID":"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd","Type":"ContainerStarted","Data":"830284e49b7fb69b88808c13c5c20cac06c9f5c0258a1d401bcbd7a1e2409807"} Sep 30 18:55:15 crc kubenswrapper[4772]: I0930 18:55:15.903872 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:55:15 crc kubenswrapper[4772]: E0930 18:55:15.904712 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:55:22 crc kubenswrapper[4772]: I0930 18:55:22.596724 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" event={"ID":"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd","Type":"ContainerStarted","Data":"1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b"} Sep 30 18:55:22 crc kubenswrapper[4772]: I0930 18:55:22.597642 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" event={"ID":"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd","Type":"ContainerStarted","Data":"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33"} Sep 30 18:55:22 crc kubenswrapper[4772]: I0930 18:55:22.621092 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" podStartSLOduration=2.8814568510000003 podStartE2EDuration="10.621045868s" podCreationTimestamp="2025-09-30 18:55:12 +0000 UTC" firstStartedPulling="2025-09-30 18:55:13.667867681 +0000 UTC m=+6814.574880522" lastFinishedPulling="2025-09-30 18:55:21.407456698 +0000 UTC m=+6822.314469539" observedRunningTime="2025-09-30 18:55:22.617324363 +0000 UTC m=+6823.524337194" watchObservedRunningTime="2025-09-30 18:55:22.621045868 +0000 UTC m=+6823.528058699" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.482284 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-nsnnz"] Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.487184 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.651294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7lh\" (UniqueName: \"kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.651719 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.753804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7lh\" (UniqueName: \"kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.754380 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.754548 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.775984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7lh\" (UniqueName: \"kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh\") pod \"crc-debug-nsnnz\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: I0930 18:55:26.808117 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:55:26 crc kubenswrapper[4772]: W0930 18:55:26.862395 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5c708f_3988_4046_887b_e989203d6ab6.slice/crio-16a5cd54fb808964de330ab8eb388689d91b6811dfab6e2b3af69511baa873d2 WatchSource:0}: Error finding container 16a5cd54fb808964de330ab8eb388689d91b6811dfab6e2b3af69511baa873d2: Status 404 returned error can't find the container with id 16a5cd54fb808964de330ab8eb388689d91b6811dfab6e2b3af69511baa873d2 Sep 30 18:55:27 crc kubenswrapper[4772]: I0930 18:55:27.650683 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" event={"ID":"ce5c708f-3988-4046-887b-e989203d6ab6","Type":"ContainerStarted","Data":"16a5cd54fb808964de330ab8eb388689d91b6811dfab6e2b3af69511baa873d2"} Sep 30 18:55:30 crc kubenswrapper[4772]: I0930 18:55:30.899474 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:55:30 crc kubenswrapper[4772]: E0930 18:55:30.900552 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:55:38 crc kubenswrapper[4772]: I0930 18:55:38.799841 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" event={"ID":"ce5c708f-3988-4046-887b-e989203d6ab6","Type":"ContainerStarted","Data":"04a0425951b483ba5b11c16009cd48ea942500fa0de490211dc4c373b9b7a70b"} Sep 30 18:55:38 crc kubenswrapper[4772]: I0930 18:55:38.818739 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" podStartSLOduration=1.572839256 podStartE2EDuration="12.818719299s" podCreationTimestamp="2025-09-30 18:55:26 +0000 UTC" firstStartedPulling="2025-09-30 18:55:26.866078158 +0000 UTC m=+6827.773090989" lastFinishedPulling="2025-09-30 18:55:38.111958201 +0000 UTC m=+6839.018971032" observedRunningTime="2025-09-30 18:55:38.815394604 +0000 UTC m=+6839.722407435" watchObservedRunningTime="2025-09-30 18:55:38.818719299 +0000 UTC m=+6839.725732130" Sep 30 18:55:43 crc kubenswrapper[4772]: I0930 18:55:43.901731 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:55:43 crc kubenswrapper[4772]: E0930 18:55:43.903311 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:55:57 crc kubenswrapper[4772]: I0930 18:55:57.899401 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:55:57 crc kubenswrapper[4772]: E0930 18:55:57.900536 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:56:11 crc kubenswrapper[4772]: I0930 18:56:11.901341 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:56:11 crc kubenswrapper[4772]: E0930 18:56:11.904985 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:56:24 crc kubenswrapper[4772]: I0930 18:56:24.899690 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:56:24 crc kubenswrapper[4772]: E0930 18:56:24.900674 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 18:56:39 crc kubenswrapper[4772]: I0930 18:56:39.913827 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 18:56:40 crc kubenswrapper[4772]: I0930 18:56:40.497007 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774"} Sep 30 18:57:04 crc kubenswrapper[4772]: I0930 18:57:04.648232 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-859bb54b8b-6n9dj_190ea63c-c6c0-47e8-988c-bd89113ef485/barbican-api/0.log" Sep 30 18:57:04 crc kubenswrapper[4772]: I0930 18:57:04.860883 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-859bb54b8b-6n9dj_190ea63c-c6c0-47e8-988c-bd89113ef485/barbican-api-log/0.log" Sep 30 18:57:05 crc kubenswrapper[4772]: I0930 18:57:05.061927 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f749c9554-fhqsc_84dc6bc7-3f82-4108-afa6-15ac7055676a/barbican-keystone-listener/0.log" Sep 30 18:57:05 crc kubenswrapper[4772]: I0930 18:57:05.279844 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f749c9554-fhqsc_84dc6bc7-3f82-4108-afa6-15ac7055676a/barbican-keystone-listener-log/0.log" Sep 30 18:57:05 crc kubenswrapper[4772]: I0930 18:57:05.480753 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-658db7799c-88bsl_31b84c03-7c14-47a5-9f86-cca25e0bf92e/barbican-worker/0.log" Sep 30 18:57:05 crc kubenswrapper[4772]: I0930 18:57:05.574006 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-658db7799c-88bsl_31b84c03-7c14-47a5-9f86-cca25e0bf92e/barbican-worker-log/0.log" Sep 30 18:57:05 crc kubenswrapper[4772]: I0930 18:57:05.801629 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h_f953fcc8-8726-4ec2-a493-d67f3f540054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.067791 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/ceilometer-central-agent/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.182101 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/ceilometer-notification-agent/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.270071 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/proxy-httpd/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.396852 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/sg-core/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.552900 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc_bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.698172 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j_d001e435-b677-46e3-a31b-f5d1ae7e5c01/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 18:57:06 crc kubenswrapper[4772]: I0930 18:57:06.950306 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_71eb05f1-a375-49c7-965d-ae495649ac7c/cinder-api-log/0.log" Sep 30 18:57:07 crc kubenswrapper[4772]: I0930 18:57:07.307480 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058/probe/0.log" Sep 30 18:57:07 crc kubenswrapper[4772]: I0930 18:57:07.876637 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e083d29-4d17-4b01-9201-dfbed0f1f304/cinder-scheduler/0.log" Sep 30 18:57:07 crc kubenswrapper[4772]: I0930 18:57:07.979853 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058/cinder-backup/0.log" Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.007079 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_71eb05f1-a375-49c7-965d-ae495649ac7c/cinder-api/0.log" Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.129412 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e083d29-4d17-4b01-9201-dfbed0f1f304/probe/0.log" Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.411546 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b6b0b394-e87c-4287-ab65-5652e2cc09e1/probe/0.log" Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.443518 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b6b0b394-e87c-4287-ab65-5652e2cc09e1/cinder-volume/0.log" Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 
Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.837167 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume2-0_08a95766-93a6-47b7-bce4-c556f7064db0/cinder-volume/0.log"
Sep 30 18:57:08 crc kubenswrapper[4772]: I0930 18:57:08.940896 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vmjps_f9277e5c-9f8e-4c7c-a979-03fce35dab53/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.070533 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx_104de20c-fde6-42d5-aa8b-f23445a3661e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.207898 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/init/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.427412 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/init/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.719732 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7019346d-46b6-4f97-b309-58376e8a2d2a/glance-log/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.739549 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7019346d-46b6-4f97-b309-58376e8a2d2a/glance-httpd/0.log"
Sep 30 18:57:09 crc kubenswrapper[4772]: I0930 18:57:09.981557 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d3ca2624-92a6-4bcf-bbb6-4780637bef02/glance-httpd/0.log"
Sep 30 18:57:10 crc kubenswrapper[4772]: I0930 18:57:10.192323 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d3ca2624-92a6-4bcf-bbb6-4780637bef02/glance-log/0.log"
Sep 30 18:57:10 crc kubenswrapper[4772]: I0930 18:57:10.265875 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/dnsmasq-dns/0.log"
Sep 30 18:57:10 crc kubenswrapper[4772]: I0930 18:57:10.488938 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd79c8f84-lx2fj_bb98c606-aef7-46e5-8242-7ebd28d542ba/horizon/0.log"
Sep 30 18:57:10 crc kubenswrapper[4772]: I0930 18:57:10.771710 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-t42nj_32420052-34e9-4cca-a4ee-239d3416cd9a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.019329 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd79c8f84-lx2fj_bb98c606-aef7-46e5-8242-7ebd28d542ba/horizon-log/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.044351 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nswzh_71289a51-de10-4dea-8aca-4a3cbd177e65/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.394378 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320921-q2m64_3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7/keystone-cron/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.666689 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ed142cd3-4d43-4293-af1f-d2a76649b5a2/kube-state-metrics/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.813376 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76ff4c9cf5-7gpvg_43dbf436-1404-454d-ab9a-870ba144ade3/keystone-api/0.log"
Sep 30 18:57:11 crc kubenswrapper[4772]: I0930 18:57:11.957033 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz_cc4ef050-7f47-4f1f-a62e-4607d290ddf3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:12 crc kubenswrapper[4772]: I0930 18:57:12.798346 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-684cbd44c-xstzf_d878293c-0383-4575-95cb-1062bcb4634e/neutron-api/0.log"
Sep 30 18:57:12 crc kubenswrapper[4772]: I0930 18:57:12.799141 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-684cbd44c-xstzf_d878293c-0383-4575-95cb-1062bcb4634e/neutron-httpd/0.log"
Sep 30 18:57:13 crc kubenswrapper[4772]: I0930 18:57:13.105978 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf_6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:15 crc kubenswrapper[4772]: I0930 18:57:15.737943 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_19205b6f-4fbc-4114-809f-3f105f8469bb/nova-api-log/0.log"
Sep 30 18:57:15 crc kubenswrapper[4772]: I0930 18:57:15.918442 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_19205b6f-4fbc-4114-809f-3f105f8469bb/nova-api-api/0.log"
Sep 30 18:57:16 crc kubenswrapper[4772]: I0930 18:57:16.394022 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a242b41e-98a7-4814-984c-70b36be61cb9/nova-cell0-conductor-conductor/0.log"
Sep 30 18:57:16 crc kubenswrapper[4772]: I0930 18:57:16.623737 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_be6b14d4-5d20-4cae-add5-4702dc26ecc5/nova-cell1-conductor-conductor/0.log"
Sep 30 18:57:17 crc kubenswrapper[4772]: I0930 18:57:17.098729 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21aeeee6-b52d-4cd0-b635-085708b6e9d9/nova-cell1-novncproxy-novncproxy/0.log"
Sep 30 18:57:17 crc kubenswrapper[4772]: I0930 18:57:17.362497 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv_52dddfd0-5fcc-47be-96c2-e3427fc66069/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:17 crc kubenswrapper[4772]: I0930 18:57:17.671180 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_dc608be5-335a-4080-9a63-9266b733dde3/nova-metadata-log/0.log"
Sep 30 18:57:18 crc kubenswrapper[4772]: I0930 18:57:18.515671 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_072e2a3c-9da9-4b3d-ab28-05338d20eb88/nova-scheduler-scheduler/0.log"
Sep 30 18:57:19 crc kubenswrapper[4772]: I0930 18:57:19.131219 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/mysql-bootstrap/0.log"
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/mysql-bootstrap/0.log" Sep 30 18:57:19 crc kubenswrapper[4772]: I0930 18:57:19.360312 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/mysql-bootstrap/0.log" Sep 30 18:57:19 crc kubenswrapper[4772]: I0930 18:57:19.595073 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/galera/0.log" Sep 30 18:57:20 crc kubenswrapper[4772]: I0930 18:57:20.170611 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/mysql-bootstrap/0.log" Sep 30 18:57:20 crc kubenswrapper[4772]: I0930 18:57:20.367731 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/mysql-bootstrap/0.log" Sep 30 18:57:20 crc kubenswrapper[4772]: I0930 18:57:20.680424 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/galera/0.log" Sep 30 18:57:21 crc kubenswrapper[4772]: I0930 18:57:21.051360 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_dc608be5-335a-4080-9a63-9266b733dde3/nova-metadata-metadata/0.log" Sep 30 18:57:21 crc kubenswrapper[4772]: I0930 18:57:21.168673 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_45aef289-46c6-4393-9032-2fe923b5948a/openstackclient/0.log" Sep 30 18:57:21 crc kubenswrapper[4772]: I0930 18:57:21.414673 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6v6fm_d66affdf-221c-4a29-a1f7-0c3d7e4d4153/ovn-controller/0.log" Sep 30 18:57:21 crc kubenswrapper[4772]: I0930 18:57:21.670398 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fkkwr_43af7d7d-ee79-4c8c-b4fd-6789a382bab3/openstack-network-exporter/0.log" Sep 30 18:57:21 crc kubenswrapper[4772]: I0930 18:57:21.932601 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server-init/0.log" Sep 30 18:57:22 crc kubenswrapper[4772]: I0930 18:57:22.499946 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server-init/0.log" Sep 30 18:57:22 crc kubenswrapper[4772]: I0930 18:57:22.715712 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server/0.log" Sep 30 18:57:22 crc kubenswrapper[4772]: I0930 18:57:22.969585 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovs-vswitchd/0.log" Sep 30 18:57:23 crc kubenswrapper[4772]: I0930 18:57:23.204868 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-llw7r_4e366f6f-7ee6-42c4-8a83-7cba085e2a46/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 18:57:23 crc kubenswrapper[4772]: I0930 18:57:23.376374 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10ca909a-0a73-4f62-89a4-ed8ffac99539/openstack-network-exporter/0.log" Sep 30 18:57:23 crc kubenswrapper[4772]: I0930 18:57:23.423995 4772 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10ca909a-0a73-4f62-89a4-ed8ffac99539/ovn-northd/0.log" Sep 30 18:57:23 crc kubenswrapper[4772]: I0930 18:57:23.654552 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_99ec9fea-a439-415b-ac73-3c4d0242eeb3/openstack-network-exporter/0.log" Sep 30 18:57:23 crc kubenswrapper[4772]: I0930 18:57:23.879932 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_99ec9fea-a439-415b-ac73-3c4d0242eeb3/ovsdbserver-nb/0.log" Sep 30 18:57:24 crc kubenswrapper[4772]: I0930 18:57:24.135413 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da545add-e15e-4ed4-b084-66691b57284b/openstack-network-exporter/0.log" Sep 30 18:57:24 crc kubenswrapper[4772]: I0930 18:57:24.309585 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da545add-e15e-4ed4-b084-66691b57284b/ovsdbserver-sb/0.log" Sep 30 18:57:24 crc kubenswrapper[4772]: I0930 18:57:24.820791 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79fbb4fcd8-68j8v_ef6c9261-05fa-449e-87ba-2c33d858daec/placement-api/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.009231 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79fbb4fcd8-68j8v_ef6c9261-05fa-449e-87ba-2c33d858daec/placement-log/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.274706 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/init-config-reloader/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.567500 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/init-config-reloader/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.618633 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/config-reloader/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.850408 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/prometheus/0.log" Sep 30 18:57:25 crc kubenswrapper[4772]: I0930 18:57:25.928636 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/thanos-sidecar/0.log" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.159266 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/setup-container/0.log" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.604674 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/setup-container/0.log" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.775097 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p8clc"] Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.786267 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8clc"] Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.792226 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.879801 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.879869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.879963 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjfq\" (UniqueName: \"kubernetes.io/projected/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-kube-api-access-pfjfq\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.908666 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/rabbitmq/0.log" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.982223 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjfq\" (UniqueName: \"kubernetes.io/projected/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-kube-api-access-pfjfq\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.982428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.982506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.983294 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:26 crc kubenswrapper[4772]: I0930 18:57:26.984307 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content\") pod \"community-operators-p8clc\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 
Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 18:57:27.142494 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8clc"
Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 18:57:27.472977 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/setup-container/0.log"
Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 18:57:27.726102 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/setup-container/0.log"
Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 18:57:27.736998 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/rabbitmq/0.log"
Sep 30 18:57:27 crc kubenswrapper[4772]: I0930 18:57:27.987171 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8clc"]
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.058406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerStarted","Data":"a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36"}
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.058905 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/setup-container/0.log"
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.333069 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/setup-container/0.log"
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.390942 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/rabbitmq/0.log"
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.647621 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk_267b3439-a782-4c26-b376-19d72ece7ea1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:28 crc kubenswrapper[4772]: I0930 18:57:28.892951 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v_489fcf90-05fb-484f-9cd9-6b403023229a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.073272 4772 generic.go:334] "Generic (PLEG): container finished" podID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerID="b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3" exitCode=0
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.073592 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerDied","Data":"b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3"}
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.076985 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.183466 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vpf6j_2d5bcedc-1eef-4301-ac7f-af49c51fc9f3/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.451835 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vlcvm_525c2cee-edb7-4953-b0c9-6f08b4496be5/ssh-known-hosts-edpm-deployment/0.log"
Sep 30 18:57:29 crc kubenswrapper[4772]: I0930 18:57:29.913381 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-k2jql_96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:30 crc kubenswrapper[4772]: I0930 18:57:30.271577 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_10f9355c-b2c3-4893-86db-91551575a21e/tempest-tests-tempest-tests-runner/0.log"
Sep 30 18:57:30 crc kubenswrapper[4772]: I0930 18:57:30.763770 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8efe2257-d089-4e71-b8cf-e80ca250b5d4/test-operator-logs-container/0.log"
Sep 30 18:57:31 crc kubenswrapper[4772]: I0930 18:57:31.094614 4772 generic.go:334] "Generic (PLEG): container finished" podID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerID="6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c" exitCode=0
Sep 30 18:57:31 crc kubenswrapper[4772]: I0930 18:57:31.094657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerDied","Data":"6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c"}
Sep 30 18:57:31 crc kubenswrapper[4772]: I0930 18:57:31.209862 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7szsj_688b1ee3-fe2c-4d2d-917f-17510c9d980a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 18:57:32 crc kubenswrapper[4772]: I0930 18:57:32.877445 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a145ab07-1aa7-42d9-9ff7-83f68417fa0e/watcher-api-log/0.log"
Sep 30 18:57:33 crc kubenswrapper[4772]: I0930 18:57:33.117588 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerStarted","Data":"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70"}
Sep 30 18:57:33 crc kubenswrapper[4772]: I0930 18:57:33.148482 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p8clc" podStartSLOduration=4.237716559 podStartE2EDuration="7.148465358s" podCreationTimestamp="2025-09-30 18:57:26 +0000 UTC" firstStartedPulling="2025-09-30 18:57:29.076722452 +0000 UTC m=+6949.983735283" lastFinishedPulling="2025-09-30 18:57:31.987471251 +0000 UTC m=+6952.894484082" observedRunningTime="2025-09-30 18:57:33.142203408 +0000 UTC m=+6954.049216239" watchObservedRunningTime="2025-09-30 18:57:33.148465358 +0000 UTC m=+6954.055478189"
Sep 30 18:57:34 crc kubenswrapper[4772]: I0930 18:57:34.059636 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_bf22b9ce-256e-4ba4-95ba-53778c010876/watcher-applier/0.log"
parsing log file" path="/var/log/pods/openstack_watcher-applier-0_bf22b9ce-256e-4ba4-95ba-53778c010876/watcher-applier/0.log" Sep 30 18:57:35 crc kubenswrapper[4772]: I0930 18:57:35.526395 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d0056c55-0e0c-4dc0-8739-4a6e05db35ea/memcached/0.log" Sep 30 18:57:35 crc kubenswrapper[4772]: I0930 18:57:35.806393 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_69f02322-0ff1-410e-8b46-dd3b5f909963/watcher-decision-engine/3.log" Sep 30 18:57:37 crc kubenswrapper[4772]: I0930 18:57:37.142658 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:37 crc kubenswrapper[4772]: I0930 18:57:37.144794 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:37 crc kubenswrapper[4772]: I0930 18:57:37.197477 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:37 crc kubenswrapper[4772]: I0930 18:57:37.251384 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:37 crc kubenswrapper[4772]: I0930 18:57:37.445513 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8clc"] Sep 30 18:57:38 crc kubenswrapper[4772]: I0930 18:57:38.715681 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a145ab07-1aa7-42d9-9ff7-83f68417fa0e/watcher-api/0.log" Sep 30 18:57:38 crc kubenswrapper[4772]: I0930 18:57:38.852421 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_69f02322-0ff1-410e-8b46-dd3b5f909963/watcher-decision-engine/4.log" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.183859 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p8clc" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="registry-server" containerID="cri-o://c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70" gracePeriod=2 Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.680114 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.832465 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities\") pod \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.832690 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfjfq\" (UniqueName: \"kubernetes.io/projected/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-kube-api-access-pfjfq\") pod \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.832839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content\") pod \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\" (UID: \"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b\") " Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.833261 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities" (OuterVolumeSpecName: "utilities") pod "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" (UID: "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.833878 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.855292 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-kube-api-access-pfjfq" (OuterVolumeSpecName: "kube-api-access-pfjfq") pod "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" (UID: "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b"). InnerVolumeSpecName "kube-api-access-pfjfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.894564 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" (UID: "c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.935533 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfjfq\" (UniqueName: \"kubernetes.io/projected/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-kube-api-access-pfjfq\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:39 crc kubenswrapper[4772]: I0930 18:57:39.935567 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.198505 4772 generic.go:334] "Generic (PLEG): container finished" podID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerID="c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70" exitCode=0 Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.198574 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8clc" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.198574 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerDied","Data":"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70"} Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.199122 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8clc" event={"ID":"c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b","Type":"ContainerDied","Data":"a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36"} Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.199150 4772 scope.go:117] "RemoveContainer" containerID="c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.238366 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8clc"] Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.261475 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p8clc"] Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.276300 4772 scope.go:117] "RemoveContainer" containerID="6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.329012 4772 scope.go:117] "RemoveContainer" containerID="b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.370154 4772 scope.go:117] "RemoveContainer" containerID="c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70" Sep 30 18:57:40 crc kubenswrapper[4772]: E0930 18:57:40.372205 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70\": container with ID starting with c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70 not found: ID does not exist" containerID="c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.372244 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70"} err="failed to get container status 
\"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70\": rpc error: code = NotFound desc = could not find container \"c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70\": container with ID starting with c54a8659ac57058c56247a293284c81d326f31bed50dd8640c4ea79038782f70 not found: ID does not exist" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.372275 4772 scope.go:117] "RemoveContainer" containerID="6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c" Sep 30 18:57:40 crc kubenswrapper[4772]: E0930 18:57:40.375472 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c\": container with ID starting with 6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c not found: ID does not exist" containerID="6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.375503 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c"} err="failed to get container status \"6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c\": rpc error: code = NotFound desc = could not find container \"6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c\": container with ID starting with 6a0cd840ce4aed7de07ead641d51fd597552cc6c705cd2c81a1dab99187a4a7c not found: ID does not exist" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.375526 4772 scope.go:117] "RemoveContainer" containerID="b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3" Sep 30 18:57:40 crc kubenswrapper[4772]: E0930 18:57:40.376142 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3\": container with ID starting with b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3 not found: ID does not exist" containerID="b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3" Sep 30 18:57:40 crc kubenswrapper[4772]: I0930 18:57:40.376300 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3"} err="failed to get container status \"b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3\": rpc error: code = NotFound desc = could not find container \"b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3\": container with ID starting with b6b62f89b95e7dd20b07c87e7c954ba31a3c996209dddc5175adcce31ba442a3 not found: ID does not exist" Sep 30 18:57:41 crc kubenswrapper[4772]: I0930 18:57:41.911675 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" path="/var/lib/kubelet/pods/c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b/volumes" Sep 30 18:57:44 crc kubenswrapper[4772]: E0930 18:57:44.780081 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache]" Sep 30 18:57:55 crc kubenswrapper[4772]: E0930 18:57:55.072860 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache]" Sep 30 18:58:05 crc kubenswrapper[4772]: E0930 18:58:05.400288 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache]" Sep 30 18:58:08 crc kubenswrapper[4772]: I0930 18:58:08.552669 4772 generic.go:334] "Generic (PLEG): container finished" podID="ce5c708f-3988-4046-887b-e989203d6ab6" containerID="04a0425951b483ba5b11c16009cd48ea942500fa0de490211dc4c373b9b7a70b" exitCode=0 Sep 30 18:58:08 crc kubenswrapper[4772]: I0930 18:58:08.552827 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" event={"ID":"ce5c708f-3988-4046-887b-e989203d6ab6","Type":"ContainerDied","Data":"04a0425951b483ba5b11c16009cd48ea942500fa0de490211dc4c373b9b7a70b"} Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.673132 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.738944 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-nsnnz"] Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.750400 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-nsnnz"] Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.752571 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg7lh\" (UniqueName: \"kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh\") pod \"ce5c708f-3988-4046-887b-e989203d6ab6\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.752947 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host\") pod \"ce5c708f-3988-4046-887b-e989203d6ab6\" (UID: \"ce5c708f-3988-4046-887b-e989203d6ab6\") " Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.753117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host" (OuterVolumeSpecName: "host") pod "ce5c708f-3988-4046-887b-e989203d6ab6" (UID: "ce5c708f-3988-4046-887b-e989203d6ab6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.753536 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5c708f-3988-4046-887b-e989203d6ab6-host\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.762998 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh" (OuterVolumeSpecName: "kube-api-access-lg7lh") pod "ce5c708f-3988-4046-887b-e989203d6ab6" (UID: "ce5c708f-3988-4046-887b-e989203d6ab6"). InnerVolumeSpecName "kube-api-access-lg7lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.856701 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg7lh\" (UniqueName: \"kubernetes.io/projected/ce5c708f-3988-4046-887b-e989203d6ab6-kube-api-access-lg7lh\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:09 crc kubenswrapper[4772]: I0930 18:58:09.912450 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5c708f-3988-4046-887b-e989203d6ab6" path="/var/lib/kubelet/pods/ce5c708f-3988-4046-887b-e989203d6ab6/volumes" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.576782 4772 scope.go:117] "RemoveContainer" containerID="04a0425951b483ba5b11c16009cd48ea942500fa0de490211dc4c373b9b7a70b" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.576842 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-nsnnz" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.933634 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-trgtp"] Sep 30 18:58:10 crc kubenswrapper[4772]: E0930 18:58:10.934183 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="registry-server" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934200 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="registry-server" Sep 30 18:58:10 crc kubenswrapper[4772]: E0930 18:58:10.934214 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="extract-utilities" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934221 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="extract-utilities" Sep 30 18:58:10 crc kubenswrapper[4772]: E0930 18:58:10.934244 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="extract-content" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934251 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="extract-content" Sep 30 18:58:10 crc kubenswrapper[4772]: E0930 18:58:10.934267 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5c708f-3988-4046-887b-e989203d6ab6" containerName="container-00" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934273 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5c708f-3988-4046-887b-e989203d6ab6" containerName="container-00" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934467 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ce5c708f-3988-4046-887b-e989203d6ab6" containerName="container-00" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.934491 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24a9d5d-0724-4cb7-a1ba-e61569ad2a2b" containerName="registry-server" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.935302 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.982140 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nln8f\" (UniqueName: \"kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:10 crc kubenswrapper[4772]: I0930 18:58:10.983214 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.085938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.086087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.086105 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nln8f\" (UniqueName: \"kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.103464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nln8f\" (UniqueName: \"kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f\") pod \"crc-debug-trgtp\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.252147 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:11 crc kubenswrapper[4772]: W0930 18:58:11.309540 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda301293c_d783_4cc1_9b97_f26f37d655e4.slice/crio-f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d WatchSource:0}: Error finding container f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d: Status 404 returned error can't find the container with id f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.591694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" event={"ID":"a301293c-d783-4cc1-9b97-f26f37d655e4","Type":"ContainerStarted","Data":"9edcc30c573a25a36ab5123e1fc71203093223d6f12f69b2c4ac95f1d53bf0c4"} Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.592218 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" event={"ID":"a301293c-d783-4cc1-9b97-f26f37d655e4","Type":"ContainerStarted","Data":"f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d"} Sep 30 18:58:11 crc kubenswrapper[4772]: I0930 18:58:11.630562 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" podStartSLOduration=1.630534792 podStartE2EDuration="1.630534792s" podCreationTimestamp="2025-09-30 18:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:58:11.62182913 +0000 UTC m=+6992.528841971" watchObservedRunningTime="2025-09-30 18:58:11.630534792 +0000 UTC m=+6992.537547623" Sep 30 18:58:12 crc kubenswrapper[4772]: I0930 18:58:12.606166 4772 generic.go:334] "Generic (PLEG): container finished" podID="a301293c-d783-4cc1-9b97-f26f37d655e4" containerID="9edcc30c573a25a36ab5123e1fc71203093223d6f12f69b2c4ac95f1d53bf0c4" exitCode=0 Sep 30 18:58:12 crc kubenswrapper[4772]: I0930 18:58:12.606252 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" event={"ID":"a301293c-d783-4cc1-9b97-f26f37d655e4","Type":"ContainerDied","Data":"9edcc30c573a25a36ab5123e1fc71203093223d6f12f69b2c4ac95f1d53bf0c4"} Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.747670 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.853722 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host\") pod \"a301293c-d783-4cc1-9b97-f26f37d655e4\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.854248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nln8f\" (UniqueName: \"kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f\") pod \"a301293c-d783-4cc1-9b97-f26f37d655e4\" (UID: \"a301293c-d783-4cc1-9b97-f26f37d655e4\") " Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.853825 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host" (OuterVolumeSpecName: "host") pod "a301293c-d783-4cc1-9b97-f26f37d655e4" (UID: "a301293c-d783-4cc1-9b97-f26f37d655e4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.855026 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a301293c-d783-4cc1-9b97-f26f37d655e4-host\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.864406 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f" (OuterVolumeSpecName: "kube-api-access-nln8f") pod "a301293c-d783-4cc1-9b97-f26f37d655e4" (UID: "a301293c-d783-4cc1-9b97-f26f37d655e4"). InnerVolumeSpecName "kube-api-access-nln8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:58:13 crc kubenswrapper[4772]: I0930 18:58:13.957285 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nln8f\" (UniqueName: \"kubernetes.io/projected/a301293c-d783-4cc1-9b97-f26f37d655e4-kube-api-access-nln8f\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:14 crc kubenswrapper[4772]: I0930 18:58:14.636738 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" event={"ID":"a301293c-d783-4cc1-9b97-f26f37d655e4","Type":"ContainerDied","Data":"f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d"} Sep 30 18:58:14 crc kubenswrapper[4772]: I0930 18:58:14.636796 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9bb8b854537c57c12e5ffe59585c41dee678fbbf8b91c89fc67916e7fc8db0d" Sep 30 18:58:14 crc kubenswrapper[4772]: I0930 18:58:14.636886 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-trgtp" Sep 30 18:58:15 crc kubenswrapper[4772]: E0930 18:58:15.681870 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache]" Sep 30 18:58:22 crc kubenswrapper[4772]: I0930 18:58:22.163599 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-trgtp"] Sep 30 18:58:22 crc kubenswrapper[4772]: I0930 18:58:22.172226 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-trgtp"] Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.331698 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-4xk47"] Sep 30 18:58:23 crc kubenswrapper[4772]: E0930 18:58:23.332678 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a301293c-d783-4cc1-9b97-f26f37d655e4" containerName="container-00" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.332694 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a301293c-d783-4cc1-9b97-f26f37d655e4" containerName="container-00" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.332921 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a301293c-d783-4cc1-9b97-f26f37d655e4" containerName="container-00" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.333821 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.380898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qx7x\" (UniqueName: \"kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.381362 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.483531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.483632 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.483944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qx7x\" (UniqueName: \"kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.505534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qx7x\" (UniqueName: \"kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x\") pod \"crc-debug-4xk47\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.653503 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.737045 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" event={"ID":"17db1011-fdd7-4270-b471-08b63e57fb5e","Type":"ContainerStarted","Data":"664586251ce71b547773cc55e41bbd3eda616aa2e38ef8410ce6c04f8a276310"} Sep 30 18:58:23 crc kubenswrapper[4772]: I0930 18:58:23.916982 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a301293c-d783-4cc1-9b97-f26f37d655e4" path="/var/lib/kubelet/pods/a301293c-d783-4cc1-9b97-f26f37d655e4/volumes" Sep 30 18:58:24 crc kubenswrapper[4772]: I0930 18:58:24.750370 4772 generic.go:334] "Generic (PLEG): container finished" podID="17db1011-fdd7-4270-b471-08b63e57fb5e" containerID="0942c48090f1df48cf4376fe7324d4ca0ca109c1d772f13613a74a6c1e05b85e" exitCode=0 Sep 30 18:58:24 crc kubenswrapper[4772]: I0930 18:58:24.750552 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" event={"ID":"17db1011-fdd7-4270-b471-08b63e57fb5e","Type":"ContainerDied","Data":"0942c48090f1df48cf4376fe7324d4ca0ca109c1d772f13613a74a6c1e05b85e"} Sep 30 18:58:24 crc kubenswrapper[4772]: I0930 18:58:24.801355 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-4xk47"] Sep 30 18:58:24 crc kubenswrapper[4772]: I0930 18:58:24.812828 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cmjnr/crc-debug-4xk47"] Sep 30 18:58:25 crc kubenswrapper[4772]: I0930 18:58:25.952835 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.039693 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host\") pod \"17db1011-fdd7-4270-b471-08b63e57fb5e\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.039798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qx7x\" (UniqueName: \"kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x\") pod \"17db1011-fdd7-4270-b471-08b63e57fb5e\" (UID: \"17db1011-fdd7-4270-b471-08b63e57fb5e\") " Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.040006 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host" (OuterVolumeSpecName: "host") pod "17db1011-fdd7-4270-b471-08b63e57fb5e" (UID: "17db1011-fdd7-4270-b471-08b63e57fb5e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.040488 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17db1011-fdd7-4270-b471-08b63e57fb5e-host\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:26 crc kubenswrapper[4772]: E0930 18:58:26.041707 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache]" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.047455 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x" (OuterVolumeSpecName: "kube-api-access-2qx7x") pod "17db1011-fdd7-4270-b471-08b63e57fb5e" (UID: "17db1011-fdd7-4270-b471-08b63e57fb5e"). InnerVolumeSpecName "kube-api-access-2qx7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.142546 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qx7x\" (UniqueName: \"kubernetes.io/projected/17db1011-fdd7-4270-b471-08b63e57fb5e-kube-api-access-2qx7x\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.653510 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-tzz8m_80d5010e-a767-491b-bcb2-89272762a121/kube-rbac-proxy/0.log" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.751018 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-tzz8m_80d5010e-a767-491b-bcb2-89272762a121/manager/0.log" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.780919 4772 scope.go:117] "RemoveContainer" containerID="0942c48090f1df48cf4376fe7324d4ca0ca109c1d772f13613a74a6c1e05b85e" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.780961 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/crc-debug-4xk47" Sep 30 18:58:26 crc kubenswrapper[4772]: I0930 18:58:26.936726 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-n7w7p_ad2965ed-ed78-4646-97ae-07cce49e8eb1/kube-rbac-proxy/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.033165 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-n7w7p_ad2965ed-ed78-4646-97ae-07cce49e8eb1/manager/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.153206 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.332259 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.336529 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.343536 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.545495 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.616849 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/extract/0.log" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.909655 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17db1011-fdd7-4270-b471-08b63e57fb5e" path="/var/lib/kubelet/pods/17db1011-fdd7-4270-b471-08b63e57fb5e/volumes" Sep 30 18:58:27 crc kubenswrapper[4772]: I0930 18:58:27.910417 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.076868 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-rqv96_832335d3-7446-4879-8ec1-8f24d6d3708a/kube-rbac-proxy/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.161520 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-rqv96_832335d3-7446-4879-8ec1-8f24d6d3708a/manager/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.229691 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-ghtj2_314c8eb1-ee8d-405d-9bb6-a74de21c2f01/kube-rbac-proxy/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.397225 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-b82x8_27e94b49-6017-4790-af32-61cdb6c41f2c/kube-rbac-proxy/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.463373 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-ghtj2_314c8eb1-ee8d-405d-9bb6-a74de21c2f01/manager/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.502088 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-b82x8_27e94b49-6017-4790-af32-61cdb6c41f2c/manager/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.600606 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-dn2kt_b7ba1160-070d-4cc4-9c53-75817bd6141e/kube-rbac-proxy/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.715598 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-dn2kt_b7ba1160-070d-4cc4-9c53-75817bd6141e/manager/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.806497 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-shhhk_058eec37-9f59-4fc5-8fa3-c9595bf58300/kube-rbac-proxy/0.log" Sep 30 18:58:28 crc kubenswrapper[4772]: I0930 18:58:28.992750 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-vdhkv_d10d7495-42f5-4919-8985-99913d62ab28/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.064468 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-vdhkv_d10d7495-42f5-4919-8985-99913d62ab28/kube-rbac-proxy/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.068403 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-shhhk_058eec37-9f59-4fc5-8fa3-c9595bf58300/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.195355 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-56vtc_44aed112-2ebc-48b6-b3b4-9a47d2dafaa9/kube-rbac-proxy/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.313071 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-56vtc_44aed112-2ebc-48b6-b3b4-9a47d2dafaa9/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.426077 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2fx6p_c886af64-f9cc-4127-9d17-3007ae492d06/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.439322 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2fx6p_c886af64-f9cc-4127-9d17-3007ae492d06/kube-rbac-proxy/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.537395 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vlpqf_1753608a-67af-4fa4-83f1-3f7d1623fc6b/kube-rbac-proxy/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.618172 4772 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vlpqf_1753608a-67af-4fa4-83f1-3f7d1623fc6b/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.726279 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-smllw_51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc/kube-rbac-proxy/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.781724 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-smllw_51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc/manager/0.log" Sep 30 18:58:29 crc kubenswrapper[4772]: I0930 18:58:29.890565 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-z472f_df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff/kube-rbac-proxy/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.078193 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-z472f_df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff/manager/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.109585 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-kpr6v_13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36/kube-rbac-proxy/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.146291 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-kpr6v_13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36/manager/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.290094 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-dxbpz_69e18d49-1290-4440-a3c9-885352fa18c5/kube-rbac-proxy/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.336389 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-dxbpz_69e18d49-1290-4440-a3c9-885352fa18c5/manager/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.385359 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd9b5767f-p4n9f_d5be7b91-f881-4cd5-878e-1d40a94a3a8d/kube-rbac-proxy/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.573487 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959786844-tbxrx_6d89b985-cd07-43bc-9024-ff6ffd1adc45/kube-rbac-proxy/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.874022 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959786844-tbxrx_6d89b985-cd07-43bc-9024-ff6ffd1adc45/operator/0.log" Sep 30 18:58:30 crc kubenswrapper[4772]: I0930 18:58:30.879016 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rg4rc_04896286-2a65-451e-8639-d0f12941e991/registry-server/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.009319 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-8xbv5_6c9f85e1-5df7-4943-9064-69af6e200e82/kube-rbac-proxy/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.214022 4772 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-8xbv5_6c9f85e1-5df7-4943-9064-69af6e200e82/manager/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.238512 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mfnlh_f3a0e5a3-c50e-48ce-801d-f7916210165b/kube-rbac-proxy/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.313715 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mfnlh_f3a0e5a3-c50e-48ce-801d-f7916210165b/manager/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.555674 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-swgvc_1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f/operator/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.658498 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-6m9mb_f8af3992-c401-4dea-b5a5-92063a05384e/kube-rbac-proxy/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.803753 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-6m9mb_f8af3992-c401-4dea-b5a5-92063a05384e/manager/0.log" Sep 30 18:58:31 crc kubenswrapper[4772]: I0930 18:58:31.890828 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-b6np7_d4295a68-a2dc-4b0b-a577-bbd6448d3a70/kube-rbac-proxy/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.042644 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd9b5767f-p4n9f_d5be7b91-f881-4cd5-878e-1d40a94a3a8d/manager/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.141715 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-xmwpp_5b10f12b-b24a-4cf6-b07b-7b3e811ccd30/kube-rbac-proxy/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.175332 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-xmwpp_5b10f12b-b24a-4cf6-b07b-7b3e811ccd30/manager/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.208713 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-b6np7_d4295a68-a2dc-4b0b-a577-bbd6448d3a70/manager/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.379134 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-86c75f6bd4-4fnzg_4fcd6b42-8644-41f5-bd3b-51184d34cd00/kube-rbac-proxy/0.log" Sep 30 18:58:32 crc kubenswrapper[4772]: I0930 18:58:32.463773 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-86c75f6bd4-4fnzg_4fcd6b42-8644-41f5-bd3b-51184d34cd00/manager/0.log" Sep 30 18:58:36 crc kubenswrapper[4772]: E0930 18:58:36.337198 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice/crio-a8f59cfec70c65c14e4b29fd17cb853d2ca5e1e1b558d43afb719ab36ae91d36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9d5d_0724_4cb7_a1ba_e61569ad2a2b.slice\": RecentStats: unable to find data in memory cache]" Sep 30 18:58:51 crc kubenswrapper[4772]: I0930 18:58:51.165983 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gdmvr_0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9/control-plane-machine-set-operator/0.log" Sep 30 18:58:51 crc kubenswrapper[4772]: I0930 18:58:51.357483 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klzl8_b023c669-cb19-4010-b9d7-120bdfff87bd/kube-rbac-proxy/0.log" Sep 30 18:58:51 crc kubenswrapper[4772]: I0930 18:58:51.401884 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klzl8_b023c669-cb19-4010-b9d7-120bdfff87bd/machine-api-operator/0.log" Sep 30 18:59:04 crc kubenswrapper[4772]: I0930 18:59:04.370653 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qtth7_75f76096-4236-46b9-8e3b-9e6784362607/cert-manager-controller/0.log" Sep 30 18:59:04 crc kubenswrapper[4772]: I0930 18:59:04.550938 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ws7dg_525413d1-592e-482c-a45a-0e88bfc94da5/cert-manager-cainjector/0.log" Sep 30 18:59:04 crc kubenswrapper[4772]: I0930 18:59:04.601942 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lkvcc_9e9dcd73-971e-4f2f-869a-317159d2c9a5/cert-manager-webhook/0.log" Sep 30 18:59:08 crc kubenswrapper[4772]: I0930 18:59:08.655842 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:59:08 crc kubenswrapper[4772]: I0930 18:59:08.656747 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:59:17 crc kubenswrapper[4772]: I0930 18:59:17.764214 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-m6pgg_f076e40b-6b99-4a23-8235-c008e4a209c5/nmstate-console-plugin/0.log" Sep 30 18:59:18 crc kubenswrapper[4772]: I0930 18:59:18.067971 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pf6qn_477c7640-d169-487f-a2d7-9164f8b26417/nmstate-handler/0.log" Sep 30 18:59:18 crc kubenswrapper[4772]: I0930 18:59:18.099819 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-sll7d_aed9b88e-7f1b-472d-a22c-ebf719c71f73/kube-rbac-proxy/0.log" Sep 30 18:59:18 crc kubenswrapper[4772]: I0930 18:59:18.144928 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-sll7d_aed9b88e-7f1b-472d-a22c-ebf719c71f73/nmstate-metrics/0.log" Sep 30 18:59:18 crc kubenswrapper[4772]: I0930 18:59:18.318982 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qds72_67cdd39b-a0de-4d14-ba2f-2419b31983da/nmstate-operator/0.log" Sep 30 18:59:18 crc kubenswrapper[4772]: I0930 18:59:18.404563 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-kphv7_4febaade-1298-413f-8f68-ca4771613783/nmstate-webhook/0.log" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.593181 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:25 crc kubenswrapper[4772]: E0930 18:59:25.594107 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17db1011-fdd7-4270-b471-08b63e57fb5e" containerName="container-00" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.594119 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17db1011-fdd7-4270-b471-08b63e57fb5e" containerName="container-00" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.594344 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="17db1011-fdd7-4270-b471-08b63e57fb5e" containerName="container-00" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.595779 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.620608 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.710672 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.711029 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.711279 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhf42\" (UniqueName: \"kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.813577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.813640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.813738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhf42\" (UniqueName: \"kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.814578 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.814670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.856514 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhf42\" (UniqueName: \"kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42\") pod \"redhat-marketplace-tt6tl\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:25 crc kubenswrapper[4772]: I0930 18:59:25.917986 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:26 crc kubenswrapper[4772]: I0930 18:59:26.584418 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:27 crc kubenswrapper[4772]: I0930 18:59:27.495151 4772 generic.go:334] "Generic (PLEG): container finished" podID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerID="00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8" exitCode=0 Sep 30 18:59:27 crc kubenswrapper[4772]: I0930 18:59:27.495261 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerDied","Data":"00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8"} Sep 30 18:59:27 crc kubenswrapper[4772]: I0930 18:59:27.495604 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerStarted","Data":"b1af226263f9e143eefd104f42b2df0aebe253e406ecc6c5cbe8b0d25778b4fd"} Sep 30 18:59:29 crc kubenswrapper[4772]: I0930 18:59:29.520029 4772 generic.go:334] "Generic (PLEG): container finished" podID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerID="dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df" exitCode=0 Sep 30 18:59:29 crc kubenswrapper[4772]: I0930 18:59:29.520109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerDied","Data":"dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df"} Sep 30 18:59:30 crc kubenswrapper[4772]: I0930 18:59:30.536040 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerStarted","Data":"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e"} Sep 30 18:59:30 crc kubenswrapper[4772]: I0930 18:59:30.569210 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tt6tl" podStartSLOduration=3.112037774 podStartE2EDuration="5.56917899s" podCreationTimestamp="2025-09-30 18:59:25 +0000 UTC" firstStartedPulling="2025-09-30 18:59:27.498749164 +0000 UTC m=+7068.405762015" lastFinishedPulling="2025-09-30 18:59:29.9558904 +0000 UTC m=+7070.862903231" observedRunningTime="2025-09-30 18:59:30.561968956 +0000 UTC m=+7071.468981787" watchObservedRunningTime="2025-09-30 18:59:30.56917899 +0000 UTC m=+7071.476191821" Sep 30 18:59:34 crc kubenswrapper[4772]: I0930 18:59:34.922499 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pbhmq_a7c392dd-0528-44c0-8fa6-85d8c33a4ac4/kube-rbac-proxy/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.098781 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pbhmq_a7c392dd-0528-44c0-8fa6-85d8c33a4ac4/controller/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.186477 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.374405 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.374776 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.404474 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.414191 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.617600 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.664326 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.664571 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.668639 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.821181 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.875734 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.895548 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.900844 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/controller/0.log" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.918652 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.919186 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:35 crc kubenswrapper[4772]: I0930 18:59:35.978274 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.091570 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/frr-metrics/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.146710 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/kube-rbac-proxy-frr/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.175183 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/kube-rbac-proxy/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.363836 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/reloader/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.412333 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-q2g5l_90506d56-68ff-4821-9594-0bfaa2ef2b57/frr-k8s-webhook-server/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.646420 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.648386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cbbfcbbd-w9mxx_d8b0a4f0-a6d9-46ff-9487-98fec1d43e07/manager/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.895050 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f67d8696d-jl7tp_fe6335cc-f638-4411-85e6-bf6beea1f24f/webhook-server/0.log" Sep 30 18:59:36 crc kubenswrapper[4772]: I0930 18:59:36.901506 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kd7pn_d1d9e7ba-297f-4ef1-913a-afb210b83c2a/kube-rbac-proxy/0.log" Sep 30 18:59:37 crc kubenswrapper[4772]: I0930 18:59:37.383530 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:37 crc kubenswrapper[4772]: I0930 18:59:37.686692 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kd7pn_d1d9e7ba-297f-4ef1-913a-afb210b83c2a/speaker/0.log" Sep 30 18:59:37 crc kubenswrapper[4772]: I0930 18:59:37.971161 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/frr/0.log" Sep 30 18:59:38 crc kubenswrapper[4772]: I0930 18:59:38.623772 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tt6tl" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="registry-server" containerID="cri-o://7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e" gracePeriod=2 Sep 30 18:59:38 crc kubenswrapper[4772]: I0930 18:59:38.655738 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:59:38 crc kubenswrapper[4772]: I0930 18:59:38.655796 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.126047 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.223636 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content\") pod \"81226efe-3a68-4561-98dc-c9eaf07a1e85\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.223801 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhf42\" (UniqueName: \"kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42\") pod \"81226efe-3a68-4561-98dc-c9eaf07a1e85\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.223897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities\") pod \"81226efe-3a68-4561-98dc-c9eaf07a1e85\" (UID: \"81226efe-3a68-4561-98dc-c9eaf07a1e85\") " Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.225084 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities" (OuterVolumeSpecName: "utilities") pod "81226efe-3a68-4561-98dc-c9eaf07a1e85" (UID: "81226efe-3a68-4561-98dc-c9eaf07a1e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.230502 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42" (OuterVolumeSpecName: "kube-api-access-mhf42") pod "81226efe-3a68-4561-98dc-c9eaf07a1e85" (UID: "81226efe-3a68-4561-98dc-c9eaf07a1e85"). InnerVolumeSpecName "kube-api-access-mhf42". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.238845 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81226efe-3a68-4561-98dc-c9eaf07a1e85" (UID: "81226efe-3a68-4561-98dc-c9eaf07a1e85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.327076 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.327124 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhf42\" (UniqueName: \"kubernetes.io/projected/81226efe-3a68-4561-98dc-c9eaf07a1e85-kube-api-access-mhf42\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.327139 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81226efe-3a68-4561-98dc-c9eaf07a1e85-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.633586 4772 generic.go:334] "Generic (PLEG): container finished" podID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerID="7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e" exitCode=0 Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.633630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerDied","Data":"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e"} Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.633648 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tt6tl" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.633663 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tt6tl" event={"ID":"81226efe-3a68-4561-98dc-c9eaf07a1e85","Type":"ContainerDied","Data":"b1af226263f9e143eefd104f42b2df0aebe253e406ecc6c5cbe8b0d25778b4fd"} Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.633681 4772 scope.go:117] "RemoveContainer" containerID="7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.661592 4772 scope.go:117] "RemoveContainer" containerID="dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.678103 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.689295 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tt6tl"] Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.698311 4772 scope.go:117] "RemoveContainer" containerID="00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.739664 4772 scope.go:117] "RemoveContainer" containerID="7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e" Sep 30 18:59:39 crc kubenswrapper[4772]: E0930 18:59:39.740646 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e\": container with ID starting with 7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e not found: ID does not exist" containerID="7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.740692 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e"} err="failed to get container status \"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e\": rpc error: code = NotFound desc = could not find container \"7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e\": container with ID starting with 7455df807a8baf839ba415d11e51d076e876093bb355c4d44b11ce3737dfa86e not found: ID does not exist" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.740726 4772 scope.go:117] "RemoveContainer" containerID="dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df" Sep 30 18:59:39 crc kubenswrapper[4772]: E0930 18:59:39.741254 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df\": container with ID starting with dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df not found: ID does not exist" containerID="dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.741277 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df"} err="failed to get container status \"dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df\": rpc error: code = NotFound desc = could not find container \"dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df\": container with ID starting with dfefadcacdb4240f9a24e1bc0d7c2e8108e667240cef55da5a9c659415c255df not found: ID does not exist" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.741292 4772 scope.go:117] "RemoveContainer" containerID="00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8" Sep 30 18:59:39 crc kubenswrapper[4772]: E0930 18:59:39.741709 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8\": container with ID starting with 00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8 not found: ID does not exist" containerID="00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.741746 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8"} err="failed to get container status \"00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8\": rpc error: code = NotFound desc = could not find container \"00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8\": container with ID starting with 00de11d738e032836eefa8e3eef49475a22386367d0103abedb45eff9e10a2d8 not found: ID does not exist" Sep 30 18:59:39 crc kubenswrapper[4772]: I0930 18:59:39.932324 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" path="/var/lib/kubelet/pods/81226efe-3a68-4561-98dc-c9eaf07a1e85/volumes" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.266513 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: 
I0930 18:59:49.467896 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.520585 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.569444 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.746324 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.804538 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/extract/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.808480 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log" Sep 30 18:59:49 crc kubenswrapper[4772]: I0930 18:59:49.951484 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.158433 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.158963 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.187193 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.390098 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/extract/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.417617 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.449102 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.582783 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.799394 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.799504 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log" Sep 30 18:59:50 crc kubenswrapper[4772]: I0930 18:59:50.837181 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.035160 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.088442 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.274521 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.571322 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.603163 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.634008 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.859460 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log" Sep 30 18:59:51 crc kubenswrapper[4772]: I0930 18:59:51.862005 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.092774 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/registry-server/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.174808 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.365305 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.458266 4772 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.524874 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.648436 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.740878 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.765313 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/extract/0.log" Sep 30 18:59:52 crc kubenswrapper[4772]: I0930 18:59:52.884466 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8qmk7_aae4ed0a-da1e-4581-913e-1c3c8c1554cc/marketplace-operator/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.026789 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.110722 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/registry-server/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.214926 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.235600 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.274077 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.447915 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.487555 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.510226 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.726204 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/registry-server/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.747175 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.761345 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.816773 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.969274 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 18:59:53 crc kubenswrapper[4772]: I0930 18:59:53.999343 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log" Sep 30 18:59:54 crc kubenswrapper[4772]: I0930 18:59:54.836555 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/registry-server/0.log" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.174129 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4"] Sep 30 19:00:00 crc kubenswrapper[4772]: E0930 19:00:00.175197 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="extract-content" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.175214 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="extract-content" Sep 30 19:00:00 crc kubenswrapper[4772]: E0930 19:00:00.175232 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.175238 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4772]: E0930 19:00:00.175252 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="extract-utilities" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.175259 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="extract-utilities" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.175448 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="81226efe-3a68-4561-98dc-c9eaf07a1e85" containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.176246 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.179019 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.185028 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.207495 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xb6g\" (UniqueName: \"kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.207564 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.207620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.219733 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4"] Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.310196 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xb6g\" (UniqueName: \"kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.310287 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.310352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.311737 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume\") pod 
\"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.323928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.328888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xb6g\" (UniqueName: \"kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g\") pod \"collect-profiles-29320980-9chh4\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:00 crc kubenswrapper[4772]: I0930 19:00:00.503821 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:01 crc kubenswrapper[4772]: I0930 19:00:01.020724 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4"] Sep 30 19:00:01 crc kubenswrapper[4772]: I0930 19:00:01.882268 4772 generic.go:334] "Generic (PLEG): container finished" podID="f67b285c-ce18-4a76-bc7d-fcde52ec050e" containerID="e6d9dae6d7386e0c8a514af820bd1e909491c16ac3130149648646526e15c397" exitCode=0 Sep 30 19:00:01 crc kubenswrapper[4772]: I0930 19:00:01.882359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" event={"ID":"f67b285c-ce18-4a76-bc7d-fcde52ec050e","Type":"ContainerDied","Data":"e6d9dae6d7386e0c8a514af820bd1e909491c16ac3130149648646526e15c397"} Sep 30 19:00:01 crc kubenswrapper[4772]: I0930 19:00:01.882673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" event={"ID":"f67b285c-ce18-4a76-bc7d-fcde52ec050e","Type":"ContainerStarted","Data":"284b2980f05862e42b5cc826f1e4cfef33dddffdf81263c17c951bae5567455f"} Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.299533 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.392085 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume\") pod \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.392623 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xb6g\" (UniqueName: \"kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g\") pod \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.392681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume\") pod \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\" (UID: \"f67b285c-ce18-4a76-bc7d-fcde52ec050e\") " Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.393621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume" (OuterVolumeSpecName: "config-volume") pod "f67b285c-ce18-4a76-bc7d-fcde52ec050e" (UID: "f67b285c-ce18-4a76-bc7d-fcde52ec050e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.401103 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f67b285c-ce18-4a76-bc7d-fcde52ec050e" (UID: "f67b285c-ce18-4a76-bc7d-fcde52ec050e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.401403 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g" (OuterVolumeSpecName: "kube-api-access-2xb6g") pod "f67b285c-ce18-4a76-bc7d-fcde52ec050e" (UID: "f67b285c-ce18-4a76-bc7d-fcde52ec050e"). InnerVolumeSpecName "kube-api-access-2xb6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.495745 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xb6g\" (UniqueName: \"kubernetes.io/projected/f67b285c-ce18-4a76-bc7d-fcde52ec050e-kube-api-access-2xb6g\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.495799 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f67b285c-ce18-4a76-bc7d-fcde52ec050e-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.495820 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f67b285c-ce18-4a76-bc7d-fcde52ec050e-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.918509 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" event={"ID":"f67b285c-ce18-4a76-bc7d-fcde52ec050e","Type":"ContainerDied","Data":"284b2980f05862e42b5cc826f1e4cfef33dddffdf81263c17c951bae5567455f"} Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.918586 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="284b2980f05862e42b5cc826f1e4cfef33dddffdf81263c17c951bae5567455f" Sep 30 19:00:03 crc kubenswrapper[4772]: I0930 19:00:03.918612 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-9chh4" Sep 30 19:00:04 crc kubenswrapper[4772]: I0930 19:00:04.413221 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl"] Sep 30 19:00:04 crc kubenswrapper[4772]: I0930 19:00:04.443302 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-9c2gl"] Sep 30 19:00:05 crc kubenswrapper[4772]: I0930 19:00:05.913438 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c8106c-b8cc-42be-b81f-99bc7d42d7cc" path="/var/lib/kubelet/pods/a5c8106c-b8cc-42be-b81f-99bc7d42d7cc/volumes" Sep 30 19:00:07 crc kubenswrapper[4772]: I0930 19:00:07.260027 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-qkggk_3cb2995b-6088-4762-8e3b-d99d0eaf03ed/prometheus-operator/0.log" Sep 30 19:00:07 crc kubenswrapper[4772]: I0930 19:00:07.462576 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b79d4788f-drcw5_167ceeed-fcd1-409a-b655-f17da9529300/prometheus-operator-admission-webhook/0.log" Sep 30 19:00:07 crc kubenswrapper[4772]: I0930 19:00:07.516963 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn_9acc0016-89fe-4a76-a443-b19b593dc666/prometheus-operator-admission-webhook/0.log" Sep 30 19:00:07 crc kubenswrapper[4772]: I0930 19:00:07.673859 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-zm69t_4ce80066-009d-4bb9-8a33-dcb521b0e08c/operator/0.log" Sep 30 19:00:07 crc kubenswrapper[4772]: I0930 19:00:07.734864 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-b9gwr_28b404db-1018-43c7-bdba-e2b0d97e1a8c/perses-operator/0.log" Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.655599 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.656107 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.656180 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.657137 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.657214 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774" gracePeriod=600 Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.997880 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774" exitCode=0 Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.998268 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774"} Sep 30 19:00:08 crc kubenswrapper[4772]: I0930 19:00:08.998406 4772 scope.go:117] "RemoveContainer" containerID="71574bfec148c485ad59b296fcf13fe057414f8268c5733bd2414143057ede2d" Sep 30 19:00:10 crc kubenswrapper[4772]: I0930 19:00:10.011106 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"} Sep 30 19:00:48 crc kubenswrapper[4772]: I0930 19:00:48.224009 4772 scope.go:117] "RemoveContainer" containerID="2f12b2be0082f006cc14ad0563b4f27ed5cf06bac043f549eab4587b1a6a6a0d" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.157583 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320981-lpckp"] Sep 30 19:01:00 crc kubenswrapper[4772]: E0930 19:01:00.158555 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67b285c-ce18-4a76-bc7d-fcde52ec050e" 
containerName="collect-profiles" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.158567 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67b285c-ce18-4a76-bc7d-fcde52ec050e" containerName="collect-profiles" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.158762 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67b285c-ce18-4a76-bc7d-fcde52ec050e" containerName="collect-profiles" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.159573 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.163800 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.163927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.164024 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427c9\" (UniqueName: \"kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.164146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.173684 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320981-lpckp"] Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.265729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.266289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.266511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " 
pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.266637 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427c9\" (UniqueName: \"kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.273098 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.274163 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.275831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.283473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427c9\" (UniqueName: \"kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9\") pod \"keystone-cron-29320981-lpckp\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.485026 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:00 crc kubenswrapper[4772]: I0930 19:01:00.939569 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320981-lpckp"] Sep 30 19:01:01 crc kubenswrapper[4772]: I0930 19:01:01.619183 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-lpckp" event={"ID":"c1e6427c-23e2-45dd-897f-127c199eecbf","Type":"ContainerStarted","Data":"a32eb4daf62a1116018c93752bcaad60a54d3c7c88d3fa97c2760e04a2e7c765"} Sep 30 19:01:01 crc kubenswrapper[4772]: I0930 19:01:01.621509 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-lpckp" event={"ID":"c1e6427c-23e2-45dd-897f-127c199eecbf","Type":"ContainerStarted","Data":"e6519387eab255558e867579574e5162cc7c336c1dde44d8e622ff416bbf29f5"} Sep 30 19:01:01 crc kubenswrapper[4772]: I0930 19:01:01.647044 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320981-lpckp" podStartSLOduration=1.6470166339999999 podStartE2EDuration="1.647016634s" podCreationTimestamp="2025-09-30 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:01:01.636542472 +0000 UTC m=+7162.543555303" watchObservedRunningTime="2025-09-30 19:01:01.647016634 +0000 UTC m=+7162.554029465" Sep 30 19:01:05 crc kubenswrapper[4772]: I0930 19:01:05.669898 4772 generic.go:334] "Generic (PLEG): container finished" podID="c1e6427c-23e2-45dd-897f-127c199eecbf" containerID="a32eb4daf62a1116018c93752bcaad60a54d3c7c88d3fa97c2760e04a2e7c765" exitCode=0 Sep 30 19:01:05 crc kubenswrapper[4772]: I0930 19:01:05.670007 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-lpckp" event={"ID":"c1e6427c-23e2-45dd-897f-127c199eecbf","Type":"ContainerDied","Data":"a32eb4daf62a1116018c93752bcaad60a54d3c7c88d3fa97c2760e04a2e7c765"} Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.108826 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.185048 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle\") pod \"c1e6427c-23e2-45dd-897f-127c199eecbf\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.185308 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data\") pod \"c1e6427c-23e2-45dd-897f-127c199eecbf\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.185342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys\") pod \"c1e6427c-23e2-45dd-897f-127c199eecbf\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.185643 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-427c9\" (UniqueName: \"kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9\") pod \"c1e6427c-23e2-45dd-897f-127c199eecbf\" (UID: \"c1e6427c-23e2-45dd-897f-127c199eecbf\") " Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.207280 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c1e6427c-23e2-45dd-897f-127c199eecbf" (UID: "c1e6427c-23e2-45dd-897f-127c199eecbf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.214597 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9" (OuterVolumeSpecName: "kube-api-access-427c9") pod "c1e6427c-23e2-45dd-897f-127c199eecbf" (UID: "c1e6427c-23e2-45dd-897f-127c199eecbf"). InnerVolumeSpecName "kube-api-access-427c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.227961 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1e6427c-23e2-45dd-897f-127c199eecbf" (UID: "c1e6427c-23e2-45dd-897f-127c199eecbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.255848 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data" (OuterVolumeSpecName: "config-data") pod "c1e6427c-23e2-45dd-897f-127c199eecbf" (UID: "c1e6427c-23e2-45dd-897f-127c199eecbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.289244 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.289301 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.289317 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1e6427c-23e2-45dd-897f-127c199eecbf-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.289334 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-427c9\" (UniqueName: \"kubernetes.io/projected/c1e6427c-23e2-45dd-897f-127c199eecbf-kube-api-access-427c9\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.699781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-lpckp" event={"ID":"c1e6427c-23e2-45dd-897f-127c199eecbf","Type":"ContainerDied","Data":"e6519387eab255558e867579574e5162cc7c336c1dde44d8e622ff416bbf29f5"} Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.700634 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6519387eab255558e867579574e5162cc7c336c1dde44d8e622ff416bbf29f5" Sep 30 19:01:07 crc kubenswrapper[4772]: I0930 19:01:07.699819 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320981-lpckp" Sep 30 19:02:38 crc kubenswrapper[4772]: I0930 19:02:38.655866 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:02:38 crc kubenswrapper[4772]: I0930 19:02:38.656823 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.222419 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:02:49 crc kubenswrapper[4772]: E0930 19:02:49.235938 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e6427c-23e2-45dd-897f-127c199eecbf" containerName="keystone-cron" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.235976 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e6427c-23e2-45dd-897f-127c199eecbf" containerName="keystone-cron" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.236402 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e6427c-23e2-45dd-897f-127c199eecbf" containerName="keystone-cron" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.239250 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:02:49 crc 
kubenswrapper[4772]: I0930 19:02:49.239410 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.419901 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzgm\" (UniqueName: \"kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.420120 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.420479 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.523129 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.523427 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqzgm\" (UniqueName: \"kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.523520 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.524203 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.524382 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.550083 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqzgm\" (UniqueName: 
\"kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm\") pod \"certified-operators-jbkws\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:49 crc kubenswrapper[4772]: I0930 19:02:49.608918 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:50 crc kubenswrapper[4772]: I0930 19:02:50.750312 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:02:50 crc kubenswrapper[4772]: W0930 19:02:50.763553 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aac0568_fd9a_430b_b0de_b0692e5f851b.slice/crio-c54c2c72a0535eb88aa54cf0eb5bdc2389066972c05fcfec68c5484759965a74 WatchSource:0}: Error finding container c54c2c72a0535eb88aa54cf0eb5bdc2389066972c05fcfec68c5484759965a74: Status 404 returned error can't find the container with id c54c2c72a0535eb88aa54cf0eb5bdc2389066972c05fcfec68c5484759965a74 Sep 30 19:02:50 crc kubenswrapper[4772]: I0930 19:02:50.939996 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerStarted","Data":"c54c2c72a0535eb88aa54cf0eb5bdc2389066972c05fcfec68c5484759965a74"} Sep 30 19:02:51 crc kubenswrapper[4772]: I0930 19:02:51.951669 4772 generic.go:334] "Generic (PLEG): container finished" podID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerID="9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1" exitCode=0 Sep 30 19:02:51 crc kubenswrapper[4772]: I0930 19:02:51.952181 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerDied","Data":"9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1"} Sep 30 19:02:51 crc kubenswrapper[4772]: I0930 19:02:51.954533 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:02:52 crc kubenswrapper[4772]: I0930 19:02:52.966389 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerStarted","Data":"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be"} Sep 30 19:02:53 crc kubenswrapper[4772]: I0930 19:02:53.982468 4772 generic.go:334] "Generic (PLEG): container finished" podID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerID="e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be" exitCode=0 Sep 30 19:02:53 crc kubenswrapper[4772]: I0930 19:02:53.983175 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerDied","Data":"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be"} Sep 30 19:02:55 crc kubenswrapper[4772]: I0930 19:02:55.012670 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerStarted","Data":"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb"} Sep 30 19:02:55 crc kubenswrapper[4772]: I0930 19:02:55.051950 4772 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-jbkws" podStartSLOduration=3.646715397 podStartE2EDuration="6.051929165s" podCreationTimestamp="2025-09-30 19:02:49 +0000 UTC" firstStartedPulling="2025-09-30 19:02:51.954158898 +0000 UTC m=+7272.861171729" lastFinishedPulling="2025-09-30 19:02:54.359372666 +0000 UTC m=+7275.266385497" observedRunningTime="2025-09-30 19:02:55.039618095 +0000 UTC m=+7275.946630916" watchObservedRunningTime="2025-09-30 19:02:55.051929165 +0000 UTC m=+7275.958941996" Sep 30 19:02:59 crc kubenswrapper[4772]: I0930 19:02:59.609816 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:59 crc kubenswrapper[4772]: I0930 19:02:59.611836 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:02:59 crc kubenswrapper[4772]: I0930 19:02:59.676907 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:03:00 crc kubenswrapper[4772]: I0930 19:03:00.132890 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:03:00 crc kubenswrapper[4772]: I0930 19:03:00.187313 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.097012 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jbkws" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="registry-server" containerID="cri-o://2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb" gracePeriod=2 Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.646463 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.790836 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content\") pod \"2aac0568-fd9a-430b-b0de-b0692e5f851b\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.791278 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqzgm\" (UniqueName: \"kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm\") pod \"2aac0568-fd9a-430b-b0de-b0692e5f851b\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.791383 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities\") pod \"2aac0568-fd9a-430b-b0de-b0692e5f851b\" (UID: \"2aac0568-fd9a-430b-b0de-b0692e5f851b\") " Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.792297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities" (OuterVolumeSpecName: "utilities") pod "2aac0568-fd9a-430b-b0de-b0692e5f851b" (UID: "2aac0568-fd9a-430b-b0de-b0692e5f851b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.803596 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm" (OuterVolumeSpecName: "kube-api-access-nqzgm") pod "2aac0568-fd9a-430b-b0de-b0692e5f851b" (UID: "2aac0568-fd9a-430b-b0de-b0692e5f851b"). InnerVolumeSpecName "kube-api-access-nqzgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.894720 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqzgm\" (UniqueName: \"kubernetes.io/projected/2aac0568-fd9a-430b-b0de-b0692e5f851b-kube-api-access-nqzgm\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:02 crc kubenswrapper[4772]: I0930 19:03:02.894771 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.115840 4772 generic.go:334] "Generic (PLEG): container finished" podID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerID="2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb" exitCode=0 Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.115916 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbkws" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.115923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerDied","Data":"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb"} Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.116702 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbkws" event={"ID":"2aac0568-fd9a-430b-b0de-b0692e5f851b","Type":"ContainerDied","Data":"c54c2c72a0535eb88aa54cf0eb5bdc2389066972c05fcfec68c5484759965a74"} Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.116771 4772 scope.go:117] "RemoveContainer" containerID="2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.142771 4772 scope.go:117] "RemoveContainer" containerID="e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.170122 4772 scope.go:117] "RemoveContainer" containerID="9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.222319 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aac0568-fd9a-430b-b0de-b0692e5f851b" (UID: "2aac0568-fd9a-430b-b0de-b0692e5f851b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.225638 4772 scope.go:117] "RemoveContainer" containerID="2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb" Sep 30 19:03:03 crc kubenswrapper[4772]: E0930 19:03:03.226655 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb\": container with ID starting with 2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb not found: ID does not exist" containerID="2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.226747 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb"} err="failed to get container status \"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb\": rpc error: code = NotFound desc = could not find container \"2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb\": container with ID starting with 2f9e8a872611bfaccbe7557046d09b61360331ecd7424fc086e2e63ddb0028cb not found: ID does not exist" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.226808 4772 scope.go:117] "RemoveContainer" containerID="e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be" Sep 30 19:03:03 crc kubenswrapper[4772]: E0930 19:03:03.227366 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be\": container with ID starting with e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be not found: ID does not exist" containerID="e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.227411 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be"} err="failed to get container status \"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be\": rpc error: code = NotFound desc = could not find container \"e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be\": container with ID starting with e70fda4c57be05b849168cad014202603a47b98c1072c35f8d3a4806e93d89be not found: ID does not exist" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.227437 4772 scope.go:117] "RemoveContainer" containerID="9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1" Sep 30 19:03:03 crc kubenswrapper[4772]: E0930 19:03:03.227805 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1\": container with ID starting with 9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1 not found: ID does not exist" containerID="9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.227869 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1"} err="failed to get container status \"9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1\": rpc error: code = NotFound desc = could not 
find container \"9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1\": container with ID starting with 9ab3df9d67a5920004cae1a187a3f65278ad3bb2a42dbb8e33379aa0d3bb0be1 not found: ID does not exist" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.305476 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aac0568-fd9a-430b-b0de-b0692e5f851b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.460641 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.475909 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jbkws"] Sep 30 19:03:03 crc kubenswrapper[4772]: I0930 19:03:03.913639 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" path="/var/lib/kubelet/pods/2aac0568-fd9a-430b-b0de-b0692e5f851b/volumes" Sep 30 19:03:04 crc kubenswrapper[4772]: I0930 19:03:04.135609 4772 generic.go:334] "Generic (PLEG): container finished" podID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerID="dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33" exitCode=0 Sep 30 19:03:04 crc kubenswrapper[4772]: I0930 19:03:04.135693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" event={"ID":"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd","Type":"ContainerDied","Data":"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33"} Sep 30 19:03:04 crc kubenswrapper[4772]: I0930 19:03:04.137291 4772 scope.go:117] "RemoveContainer" containerID="dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33" Sep 30 19:03:04 crc kubenswrapper[4772]: I0930 19:03:04.980663 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmjnr_must-gather-tcnzd_2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd/gather/0.log" Sep 30 19:03:08 crc kubenswrapper[4772]: I0930 19:03:08.655084 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:03:08 crc kubenswrapper[4772]: I0930 19:03:08.655522 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:03:14 crc kubenswrapper[4772]: I0930 19:03:14.792576 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cmjnr/must-gather-tcnzd"] Sep 30 19:03:14 crc kubenswrapper[4772]: I0930 19:03:14.793749 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="copy" containerID="cri-o://1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b" gracePeriod=2 Sep 30 19:03:14 crc kubenswrapper[4772]: I0930 19:03:14.809172 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cmjnr/must-gather-tcnzd"] Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 
19:03:15.267389 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmjnr_must-gather-tcnzd_2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd/copy/0.log" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.268865 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.304965 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output\") pod \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.305370 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcq95\" (UniqueName: \"kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95\") pod \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\" (UID: \"2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd\") " Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.313277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95" (OuterVolumeSpecName: "kube-api-access-gcq95") pod "2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" (UID: "2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd"). InnerVolumeSpecName "kube-api-access-gcq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.323787 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmjnr_must-gather-tcnzd_2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd/copy/0.log" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.328272 4772 generic.go:334] "Generic (PLEG): container finished" podID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerID="1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b" exitCode=143 Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.328361 4772 scope.go:117] "RemoveContainer" containerID="1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.328600 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmjnr/must-gather-tcnzd" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.370722 4772 scope.go:117] "RemoveContainer" containerID="dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.410052 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcq95\" (UniqueName: \"kubernetes.io/projected/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-kube-api-access-gcq95\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.466470 4772 scope.go:117] "RemoveContainer" containerID="1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b" Sep 30 19:03:15 crc kubenswrapper[4772]: E0930 19:03:15.467079 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b\": container with ID starting with 1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b not found: ID does not exist" containerID="1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.467139 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b"} err="failed to get container status \"1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b\": rpc error: code = NotFound desc = could not find container \"1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b\": container with ID starting with 1cf222ebc96654ffeabf5d3eb2f46c5dcd75ee92d3a06a945547c8f859900d5b not found: ID does not exist" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.467176 4772 scope.go:117] "RemoveContainer" containerID="dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33" Sep 30 19:03:15 crc kubenswrapper[4772]: E0930 19:03:15.467779 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33\": container with ID starting with dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33 not found: ID does not exist" containerID="dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.467818 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33"} err="failed to get container status \"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33\": rpc error: code = NotFound desc = could not find container \"dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33\": container with ID starting with dafbef6bac3e48cc7e43c11aa43b5fa2ba103a20ddf4cd12af2113eea42b1a33 not found: ID does not exist" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.540424 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" (UID: "2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.615347 4772 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:15 crc kubenswrapper[4772]: I0930 19:03:15.910452 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" path="/var/lib/kubelet/pods/2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd/volumes" Sep 30 19:03:38 crc kubenswrapper[4772]: I0930 19:03:38.655774 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:03:38 crc kubenswrapper[4772]: I0930 19:03:38.656955 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:03:38 crc kubenswrapper[4772]: I0930 19:03:38.657045 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 19:03:38 crc kubenswrapper[4772]: I0930 19:03:38.658453 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:03:38 crc kubenswrapper[4772]: I0930 19:03:38.658539 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" gracePeriod=600 Sep 30 19:03:38 crc kubenswrapper[4772]: E0930 19:03:38.790649 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.619698 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" exitCode=0 Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.619764 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"} Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.619802 4772 scope.go:117] "RemoveContainer" 
containerID="77c9607ed2a0e38d68bcdc15808423e68f6c0324df585a7ce24dade791feb774" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.620906 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.621478 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.885283 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs27s/must-gather-v5kw2"] Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.886120 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="copy" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886136 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="copy" Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.886154 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="extract-content" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886160 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="extract-content" Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.886171 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="gather" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886177 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="gather" Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.886187 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="registry-server" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886193 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="registry-server" Sep 30 19:03:39 crc kubenswrapper[4772]: E0930 19:03:39.886220 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="extract-utilities" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886226 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="extract-utilities" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886414 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="gather" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886427 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aac0568-fd9a-430b-b0de-b0692e5f851b" containerName="registry-server" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.886437 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2048273a-0bb5-4dd8-ab4e-9d08c0a13fbd" containerName="copy" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.892209 4772 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.913338 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rs27s"/"default-dockercfg-klzrz" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.914015 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rs27s"/"kube-root-ca.crt" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.914251 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rs27s"/"openshift-service-ca.crt" Sep 30 19:03:39 crc kubenswrapper[4772]: I0930 19:03:39.934365 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs27s/must-gather-v5kw2"] Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.001218 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.001322 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhx2t\" (UniqueName: \"kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.104090 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.104190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhx2t\" (UniqueName: \"kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.104832 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.128414 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhx2t\" (UniqueName: \"kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t\") pod \"must-gather-v5kw2\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.220604 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:03:40 crc kubenswrapper[4772]: I0930 19:03:40.988106 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs27s/must-gather-v5kw2"] Sep 30 19:03:41 crc kubenswrapper[4772]: I0930 19:03:41.688291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/must-gather-v5kw2" event={"ID":"8421b437-89eb-482f-a03b-898e483527e1","Type":"ContainerStarted","Data":"2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed"} Sep 30 19:03:41 crc kubenswrapper[4772]: I0930 19:03:41.688598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/must-gather-v5kw2" event={"ID":"8421b437-89eb-482f-a03b-898e483527e1","Type":"ContainerStarted","Data":"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9"} Sep 30 19:03:41 crc kubenswrapper[4772]: I0930 19:03:41.688614 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/must-gather-v5kw2" event={"ID":"8421b437-89eb-482f-a03b-898e483527e1","Type":"ContainerStarted","Data":"0d7bfb542eb3f608606b6e6647fd38efd40fe6b9dd1b6171fc02eb9e688faceb"} Sep 30 19:03:41 crc kubenswrapper[4772]: I0930 19:03:41.718757 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs27s/must-gather-v5kw2" podStartSLOduration=2.718724166 podStartE2EDuration="2.718724166s" podCreationTimestamp="2025-09-30 19:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:03:41.706516929 +0000 UTC m=+7322.613529780" watchObservedRunningTime="2025-09-30 19:03:41.718724166 +0000 UTC m=+7322.625737007" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.328099 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs27s/crc-debug-j7f6j"] Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.330877 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.468874 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shr7x\" (UniqueName: \"kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.469011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.571453 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shr7x\" (UniqueName: \"kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.571584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.571857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.618955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shr7x\" (UniqueName: \"kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x\") pod \"crc-debug-j7f6j\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.657006 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:03:45 crc kubenswrapper[4772]: I0930 19:03:45.750155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" event={"ID":"b4f5263a-e583-4269-80e5-98750b7e5e6d","Type":"ContainerStarted","Data":"6cefe0d9c94c11e33fc10f8bbc5f71d6e5e2d2929405ec46217af071040ac4cf"} Sep 30 19:03:46 crc kubenswrapper[4772]: I0930 19:03:46.760037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" event={"ID":"b4f5263a-e583-4269-80e5-98750b7e5e6d","Type":"ContainerStarted","Data":"16b4182f652abda7ae9ddc4be51a1417db155c0470605b6305aa80b843643714"} Sep 30 19:03:46 crc kubenswrapper[4772]: I0930 19:03:46.782140 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" podStartSLOduration=1.7821168539999999 podStartE2EDuration="1.782116854s" podCreationTimestamp="2025-09-30 19:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:03:46.779694501 +0000 UTC m=+7327.686707332" watchObservedRunningTime="2025-09-30 19:03:46.782116854 +0000 UTC m=+7327.689129685" Sep 30 19:03:50 crc kubenswrapper[4772]: I0930 19:03:50.900872 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:03:50 crc kubenswrapper[4772]: E0930 19:03:50.901953 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:01 crc kubenswrapper[4772]: I0930 19:04:01.899118 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:04:01 crc kubenswrapper[4772]: E0930 19:04:01.900148 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:14 crc kubenswrapper[4772]: I0930 19:04:14.899744 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:04:14 crc kubenswrapper[4772]: E0930 19:04:14.900647 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:29 crc kubenswrapper[4772]: I0930 19:04:29.909296 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:04:29 crc kubenswrapper[4772]: E0930 19:04:29.910688 
4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:42 crc kubenswrapper[4772]: I0930 19:04:42.898673 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:04:42 crc kubenswrapper[4772]: E0930 19:04:42.899624 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:48 crc kubenswrapper[4772]: I0930 19:04:48.508292 4772 scope.go:117] "RemoveContainer" containerID="9edcc30c573a25a36ab5123e1fc71203093223d6f12f69b2c4ac95f1d53bf0c4" Sep 30 19:04:56 crc kubenswrapper[4772]: I0930 19:04:56.898992 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:04:56 crc kubenswrapper[4772]: E0930 19:04:56.899953 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.772642 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.775801 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.787948 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.882700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbj7\" (UniqueName: \"kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.882792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.882840 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.990023 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbj7\" (UniqueName: \"kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.990404 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.990546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.990866 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:04:59 crc kubenswrapper[4772]: I0930 19:04:59.991332 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:00 crc kubenswrapper[4772]: I0930 19:05:00.013467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lhbj7\" (UniqueName: \"kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7\") pod \"redhat-operators-ccx9l\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:00 crc kubenswrapper[4772]: I0930 19:05:00.124701 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:00 crc kubenswrapper[4772]: I0930 19:05:00.651779 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:05:01 crc kubenswrapper[4772]: I0930 19:05:01.657022 4772 generic.go:334] "Generic (PLEG): container finished" podID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerID="16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e" exitCode=0 Sep 30 19:05:01 crc kubenswrapper[4772]: I0930 19:05:01.657194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerDied","Data":"16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e"} Sep 30 19:05:01 crc kubenswrapper[4772]: I0930 19:05:01.657601 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerStarted","Data":"29f4cee3b0c244e19270b4f49ec210f154116271be8f9b1ed00a08283ec99f45"} Sep 30 19:05:03 crc kubenswrapper[4772]: I0930 19:05:03.681737 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerStarted","Data":"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef"} Sep 30 19:05:07 crc kubenswrapper[4772]: I0930 19:05:07.731696 4772 generic.go:334] "Generic (PLEG): container finished" podID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerID="607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef" exitCode=0 Sep 30 19:05:07 crc kubenswrapper[4772]: I0930 19:05:07.731998 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerDied","Data":"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef"} Sep 30 19:05:07 crc kubenswrapper[4772]: I0930 19:05:07.902693 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:05:07 crc kubenswrapper[4772]: E0930 19:05:07.903039 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:05:08 crc kubenswrapper[4772]: I0930 19:05:08.746757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerStarted","Data":"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf"} Sep 30 19:05:08 crc kubenswrapper[4772]: I0930 19:05:08.776411 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-ccx9l" podStartSLOduration=3.245869657 podStartE2EDuration="9.776385025s" podCreationTimestamp="2025-09-30 19:04:59 +0000 UTC" firstStartedPulling="2025-09-30 19:05:01.660763534 +0000 UTC m=+7402.567776375" lastFinishedPulling="2025-09-30 19:05:08.191278912 +0000 UTC m=+7409.098291743" observedRunningTime="2025-09-30 19:05:08.764138297 +0000 UTC m=+7409.671151128" watchObservedRunningTime="2025-09-30 19:05:08.776385025 +0000 UTC m=+7409.683397856" Sep 30 19:05:10 crc kubenswrapper[4772]: I0930 19:05:10.124963 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:10 crc kubenswrapper[4772]: I0930 19:05:10.125444 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:11 crc kubenswrapper[4772]: I0930 19:05:11.185370 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccx9l" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" probeResult="failure" output=< Sep 30 19:05:11 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 19:05:11 crc kubenswrapper[4772]: > Sep 30 19:05:21 crc kubenswrapper[4772]: I0930 19:05:21.188398 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccx9l" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" probeResult="failure" output=< Sep 30 19:05:21 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Sep 30 19:05:21 crc kubenswrapper[4772]: > Sep 30 19:05:22 crc kubenswrapper[4772]: I0930 19:05:22.898272 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:05:22 crc kubenswrapper[4772]: E0930 19:05:22.899628 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:05:23 crc kubenswrapper[4772]: I0930 19:05:23.959742 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-859bb54b8b-6n9dj_190ea63c-c6c0-47e8-988c-bd89113ef485/barbican-api/0.log" Sep 30 19:05:23 crc kubenswrapper[4772]: I0930 19:05:23.978124 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-859bb54b8b-6n9dj_190ea63c-c6c0-47e8-988c-bd89113ef485/barbican-api-log/0.log" Sep 30 19:05:24 crc kubenswrapper[4772]: I0930 19:05:24.219335 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f749c9554-fhqsc_84dc6bc7-3f82-4108-afa6-15ac7055676a/barbican-keystone-listener/0.log" Sep 30 19:05:24 crc kubenswrapper[4772]: I0930 19:05:24.309862 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f749c9554-fhqsc_84dc6bc7-3f82-4108-afa6-15ac7055676a/barbican-keystone-listener-log/0.log" Sep 30 19:05:24 crc kubenswrapper[4772]: I0930 19:05:24.486925 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-658db7799c-88bsl_31b84c03-7c14-47a5-9f86-cca25e0bf92e/barbican-worker/0.log" Sep 30 19:05:24 crc kubenswrapper[4772]: I0930 19:05:24.529717 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-658db7799c-88bsl_31b84c03-7c14-47a5-9f86-cca25e0bf92e/barbican-worker-log/0.log" Sep 30 19:05:24 crc kubenswrapper[4772]: I0930 19:05:24.756142 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sx67h_f953fcc8-8726-4ec2-a493-d67f3f540054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.031623 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/ceilometer-central-agent/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.048693 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/ceilometer-notification-agent/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.114767 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/proxy-httpd/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.232103 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b38e508-02ac-4e05-aca3-2eb9e5ccd4b5/sg-core/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.294547 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-2s8nc_bcbe354d-fa6d-4cb9-a111-ba8dc3e38ad4/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.447733 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-b6m5j_d001e435-b677-46e3-a31b-f5d1ae7e5c01/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:25 crc kubenswrapper[4772]: I0930 19:05:25.727361 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_71eb05f1-a375-49c7-965d-ae495649ac7c/cinder-api-log/0.log" Sep 30 19:05:26 crc kubenswrapper[4772]: I0930 19:05:26.064234 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058/probe/0.log" Sep 30 19:05:26 crc kubenswrapper[4772]: I0930 19:05:26.414631 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e083d29-4d17-4b01-9201-dfbed0f1f304/cinder-scheduler/0.log" Sep 30 19:05:26 crc kubenswrapper[4772]: I0930 19:05:26.513486 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e083d29-4d17-4b01-9201-dfbed0f1f304/probe/0.log" Sep 30 19:05:26 crc kubenswrapper[4772]: I0930 19:05:26.848726 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f6f9cbd-04ed-4bf0-b5cb-76a0f561c058/cinder-backup/0.log" Sep 30 19:05:26 crc kubenswrapper[4772]: I0930 19:05:26.900196 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b6b0b394-e87c-4287-ab65-5652e2cc09e1/probe/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.071032 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b6b0b394-e87c-4287-ab65-5652e2cc09e1/cinder-volume/0.log" Sep 30 19:05:27 crc 
kubenswrapper[4772]: I0930 19:05:27.110880 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_71eb05f1-a375-49c7-965d-ae495649ac7c/cinder-api/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.292824 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume2-0_08a95766-93a6-47b7-bce4-c556f7064db0/cinder-volume/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.365024 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume2-0_08a95766-93a6-47b7-bce4-c556f7064db0/probe/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.397735 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vmjps_f9277e5c-9f8e-4c7c-a979-03fce35dab53/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.554348 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-t2bgx_104de20c-fde6-42d5-aa8b-f23445a3661e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.627196 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/init/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.809430 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/init/0.log" Sep 30 19:05:27 crc kubenswrapper[4772]: I0930 19:05:27.868366 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7019346d-46b6-4f97-b309-58376e8a2d2a/glance-httpd/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.034992 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7019346d-46b6-4f97-b309-58376e8a2d2a/glance-log/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.176268 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d3ca2624-92a6-4bcf-bbb6-4780637bef02/glance-httpd/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.264776 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d3ca2624-92a6-4bcf-bbb6-4780637bef02/glance-log/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.635177 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd79c8f84-lx2fj_bb98c606-aef7-46e5-8242-7ebd28d542ba/horizon/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.816309 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5498b49c99-7mbh2_51a7ec88-f2d8-434d-88ea-3e3ce6c639c5/dnsmasq-dns/0.log" Sep 30 19:05:28 crc kubenswrapper[4772]: I0930 19:05:28.849926 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-t42nj_32420052-34e9-4cca-a4ee-239d3416cd9a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.115483 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nswzh_71289a51-de10-4dea-8aca-4a3cbd177e65/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:29 crc 
kubenswrapper[4772]: I0930 19:05:29.207685 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd79c8f84-lx2fj_bb98c606-aef7-46e5-8242-7ebd28d542ba/horizon-log/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.385198 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320921-q2m64_3ca146bb-c6d4-4c27-bae2-ea38c01dd0f7/keystone-cron/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.569363 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320981-lpckp_c1e6427c-23e2-45dd-897f-127c199eecbf/keystone-cron/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.744359 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ed142cd3-4d43-4293-af1f-d2a76649b5a2/kube-state-metrics/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.868312 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76ff4c9cf5-7gpvg_43dbf436-1404-454d-ab9a-870ba144ade3/keystone-api/0.log" Sep 30 19:05:29 crc kubenswrapper[4772]: I0930 19:05:29.985171 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jhgpz_cc4ef050-7f47-4f1f-a62e-4607d290ddf3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.258320 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.346594 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.850003 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d5sjf_6aa08aae-dcd8-4e65-80ab-d6e2f1f606fb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.940462 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-684cbd44c-xstzf_d878293c-0383-4575-95cb-1062bcb4634e/neutron-httpd/0.log" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.971803 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-684cbd44c-xstzf_d878293c-0383-4575-95cb-1062bcb4634e/neutron-api/0.log" Sep 30 19:05:30 crc kubenswrapper[4772]: I0930 19:05:30.976005 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.018759 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ccx9l" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" containerID="cri-o://71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf" gracePeriod=2 Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.070501 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a242b41e-98a7-4814-984c-70b36be61cb9/nova-cell0-conductor-conductor/0.log" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.626647 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.702652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities\") pod \"1695a4ce-a501-4556-97d1-4da0d997f38c\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.702726 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhbj7\" (UniqueName: \"kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7\") pod \"1695a4ce-a501-4556-97d1-4da0d997f38c\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.702928 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content\") pod \"1695a4ce-a501-4556-97d1-4da0d997f38c\" (UID: \"1695a4ce-a501-4556-97d1-4da0d997f38c\") " Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.704000 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities" (OuterVolumeSpecName: "utilities") pod "1695a4ce-a501-4556-97d1-4da0d997f38c" (UID: "1695a4ce-a501-4556-97d1-4da0d997f38c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.713169 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7" (OuterVolumeSpecName: "kube-api-access-lhbj7") pod "1695a4ce-a501-4556-97d1-4da0d997f38c" (UID: "1695a4ce-a501-4556-97d1-4da0d997f38c"). InnerVolumeSpecName "kube-api-access-lhbj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.788899 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1695a4ce-a501-4556-97d1-4da0d997f38c" (UID: "1695a4ce-a501-4556-97d1-4da0d997f38c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.806594 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.806744 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhbj7\" (UniqueName: \"kubernetes.io/projected/1695a4ce-a501-4556-97d1-4da0d997f38c-kube-api-access-lhbj7\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.806757 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1695a4ce-a501-4556-97d1-4da0d997f38c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:32 crc kubenswrapper[4772]: I0930 19:05:32.959623 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_be6b14d4-5d20-4cae-add5-4702dc26ecc5/nova-cell1-conductor-conductor/0.log" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.034770 4772 generic.go:334] "Generic (PLEG): container finished" podID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerID="71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf" exitCode=0 Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.034844 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerDied","Data":"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf"} Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.034889 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccx9l" event={"ID":"1695a4ce-a501-4556-97d1-4da0d997f38c","Type":"ContainerDied","Data":"29f4cee3b0c244e19270b4f49ec210f154116271be8f9b1ed00a08283ec99f45"} Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.034915 4772 scope.go:117] "RemoveContainer" containerID="71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.034924 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccx9l" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.068783 4772 scope.go:117] "RemoveContainer" containerID="607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.099038 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.108629 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ccx9l"] Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.110879 4772 scope.go:117] "RemoveContainer" containerID="16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.151593 4772 scope.go:117] "RemoveContainer" containerID="71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf" Sep 30 19:05:33 crc kubenswrapper[4772]: E0930 19:05:33.152235 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf\": container with ID starting with 71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf not found: ID does not exist" containerID="71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.152276 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf"} err="failed to get container status \"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf\": rpc error: code = NotFound desc = could not find container \"71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf\": container with ID starting with 71e98fed07fd39a4180b661cf67bd85c7fe1ad7431a18b022622d165450c5fbf not found: ID does not exist" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.152302 4772 scope.go:117] "RemoveContainer" containerID="607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef" Sep 30 19:05:33 crc kubenswrapper[4772]: E0930 19:05:33.152544 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef\": container with ID starting with 607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef not found: ID does not exist" containerID="607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.152572 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef"} err="failed to get container status \"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef\": rpc error: code = NotFound desc = could not find container \"607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef\": container with ID starting with 607eaba9f7a2944abf24526f7f347fc78b2bf85df9f897fbbd3a9b0b7a7caeef not found: ID does not exist" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.152589 4772 scope.go:117] "RemoveContainer" containerID="16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e" Sep 30 19:05:33 crc kubenswrapper[4772]: E0930 19:05:33.152871 4772 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e\": container with ID starting with 16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e not found: ID does not exist" containerID="16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.152921 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e"} err="failed to get container status \"16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e\": rpc error: code = NotFound desc = could not find container \"16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e\": container with ID starting with 16463d05d6ce8493a9e04a9a423e8887769e2c7475020bc7fb7a107407c5079e not found: ID does not exist" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.631316 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_19205b6f-4fbc-4114-809f-3f105f8469bb/nova-api-log/0.log" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.834766 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21aeeee6-b52d-4cd0-b635-085708b6e9d9/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 19:05:33 crc kubenswrapper[4772]: I0930 19:05:33.912836 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" path="/var/lib/kubelet/pods/1695a4ce-a501-4556-97d1-4da0d997f38c/volumes" Sep 30 19:05:34 crc kubenswrapper[4772]: I0930 19:05:34.222373 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2p7bv_52dddfd0-5fcc-47be-96c2-e3427fc66069/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:34 crc kubenswrapper[4772]: I0930 19:05:34.314243 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_19205b6f-4fbc-4114-809f-3f105f8469bb/nova-api-api/0.log" Sep 30 19:05:34 crc kubenswrapper[4772]: I0930 19:05:34.430680 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_dc608be5-335a-4080-9a63-9266b733dde3/nova-metadata-log/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.063546 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/mysql-bootstrap/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.130423 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_072e2a3c-9da9-4b3d-ab28-05338d20eb88/nova-scheduler-scheduler/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.360321 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/mysql-bootstrap/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.389149 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4b4b3176-3882-486d-8217-54f429906f49/galera/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.673366 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/mysql-bootstrap/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.916311 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/mysql-bootstrap/0.log" Sep 30 19:05:35 crc kubenswrapper[4772]: I0930 19:05:35.933212 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5548eec2-33be-42b2-9b84-572236f095db/galera/0.log" Sep 30 19:05:36 crc kubenswrapper[4772]: I0930 19:05:36.154493 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_45aef289-46c6-4393-9032-2fe923b5948a/openstackclient/0.log" Sep 30 19:05:36 crc kubenswrapper[4772]: I0930 19:05:36.399211 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6v6fm_d66affdf-221c-4a29-a1f7-0c3d7e4d4153/ovn-controller/0.log" Sep 30 19:05:36 crc kubenswrapper[4772]: I0930 19:05:36.655344 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fkkwr_43af7d7d-ee79-4c8c-b4fd-6789a382bab3/openstack-network-exporter/0.log" Sep 30 19:05:36 crc kubenswrapper[4772]: I0930 19:05:36.898090 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:05:36 crc kubenswrapper[4772]: E0930 19:05:36.898409 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:05:36 crc kubenswrapper[4772]: I0930 19:05:36.903383 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server-init/0.log" Sep 30 19:05:37 crc kubenswrapper[4772]: I0930 19:05:37.209814 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server-init/0.log" Sep 30 19:05:37 crc kubenswrapper[4772]: I0930 19:05:37.411554 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovsdb-server/0.log" Sep 30 19:05:37 crc kubenswrapper[4772]: I0930 19:05:37.865118 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5kwk_e052869f-fd26-497b-9573-0ee6221fa96c/ovs-vswitchd/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.026694 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-llw7r_4e366f6f-7ee6-42c4-8a83-7cba085e2a46/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.254116 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10ca909a-0a73-4f62-89a4-ed8ffac99539/openstack-network-exporter/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.335558 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_dc608be5-335a-4080-9a63-9266b733dde3/nova-metadata-metadata/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.354245 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10ca909a-0a73-4f62-89a4-ed8ffac99539/ovn-northd/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.580531 4772 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-nb-0_99ec9fea-a439-415b-ac73-3c4d0242eeb3/openstack-network-exporter/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.587243 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_99ec9fea-a439-415b-ac73-3c4d0242eeb3/ovsdbserver-nb/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.808081 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da545add-e15e-4ed4-b084-66691b57284b/openstack-network-exporter/0.log" Sep 30 19:05:38 crc kubenswrapper[4772]: I0930 19:05:38.891756 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da545add-e15e-4ed4-b084-66691b57284b/ovsdbserver-sb/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.382244 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79fbb4fcd8-68j8v_ef6c9261-05fa-449e-87ba-2c33d858daec/placement-api/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.462978 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79fbb4fcd8-68j8v_ef6c9261-05fa-449e-87ba-2c33d858daec/placement-log/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.548632 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/init-config-reloader/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.804093 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/prometheus/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.804167 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/init-config-reloader/0.log" Sep 30 19:05:39 crc kubenswrapper[4772]: I0930 19:05:39.854443 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/config-reloader/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.062213 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8881ab23-9d2d-4563-b838-7b4583805e4f/thanos-sidecar/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.134682 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/setup-container/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.397028 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/setup-container/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.433975 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_442ae296-125c-4c92-97b3-f2c04dac157e/rabbitmq/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.602689 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/setup-container/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.911021 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/setup-container/0.log" Sep 30 19:05:40 crc kubenswrapper[4772]: I0930 19:05:40.956975 4772 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_607217cf-8f90-4adb-bca7-0271ea8a7b9b/rabbitmq/0.log" Sep 30 19:05:41 crc kubenswrapper[4772]: I0930 19:05:41.208400 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/setup-container/0.log" Sep 30 19:05:41 crc kubenswrapper[4772]: I0930 19:05:41.378125 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/setup-container/0.log" Sep 30 19:05:41 crc kubenswrapper[4772]: I0930 19:05:41.436975 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc65bd09-5d06-4b46-b8ca-c518e77acd9c/rabbitmq/0.log" Sep 30 19:05:41 crc kubenswrapper[4772]: I0930 19:05:41.596385 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-t4plk_267b3439-a782-4c26-b376-19d72ece7ea1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:41 crc kubenswrapper[4772]: I0930 19:05:41.796817 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-86h2v_489fcf90-05fb-484f-9cd9-6b403023229a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:42 crc kubenswrapper[4772]: I0930 19:05:41.999077 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vpf6j_2d5bcedc-1eef-4301-ac7f-af49c51fc9f3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:42 crc kubenswrapper[4772]: I0930 19:05:42.209134 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vlcvm_525c2cee-edb7-4953-b0c9-6f08b4496be5/ssh-known-hosts-edpm-deployment/0.log" Sep 30 19:05:42 crc kubenswrapper[4772]: I0930 19:05:42.468157 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-k2jql_96e3fb1c-95c8-4f59-9c3d-9546f6adf7a5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:42 crc kubenswrapper[4772]: I0930 19:05:42.657733 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_10f9355c-b2c3-4893-86db-91551575a21e/tempest-tests-tempest-tests-runner/0.log" Sep 30 19:05:42 crc kubenswrapper[4772]: I0930 19:05:42.790274 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8efe2257-d089-4e71-b8cf-e80ca250b5d4/test-operator-logs-container/0.log" Sep 30 19:05:43 crc kubenswrapper[4772]: I0930 19:05:43.039335 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7szsj_688b1ee3-fe2c-4d2d-917f-17510c9d980a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:05:44 crc kubenswrapper[4772]: I0930 19:05:44.383491 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_bf22b9ce-256e-4ba4-95ba-53778c010876/watcher-applier/0.log" Sep 30 19:05:44 crc kubenswrapper[4772]: I0930 19:05:44.704733 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a145ab07-1aa7-42d9-9ff7-83f68417fa0e/watcher-api-log/0.log" Sep 30 19:05:45 crc kubenswrapper[4772]: I0930 19:05:45.008306 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-decision-engine-0_69f02322-0ff1-410e-8b46-dd3b5f909963/watcher-decision-engine/3.log" Sep 30 19:05:49 crc kubenswrapper[4772]: I0930 19:05:49.696079 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_69f02322-0ff1-410e-8b46-dd3b5f909963/watcher-decision-engine/4.log" Sep 30 19:05:50 crc kubenswrapper[4772]: I0930 19:05:50.897795 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:05:50 crc kubenswrapper[4772]: E0930 19:05:50.898684 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:05:51 crc kubenswrapper[4772]: I0930 19:05:51.154078 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a145ab07-1aa7-42d9-9ff7-83f68417fa0e/watcher-api/0.log" Sep 30 19:05:58 crc kubenswrapper[4772]: I0930 19:05:58.480968 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d0056c55-0e0c-4dc0-8739-4a6e05db35ea/memcached/0.log" Sep 30 19:06:01 crc kubenswrapper[4772]: I0930 19:06:01.898601 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:06:01 crc kubenswrapper[4772]: E0930 19:06:01.899742 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:06:13 crc kubenswrapper[4772]: I0930 19:06:13.489424 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4f5263a-e583-4269-80e5-98750b7e5e6d" containerID="16b4182f652abda7ae9ddc4be51a1417db155c0470605b6305aa80b843643714" exitCode=0 Sep 30 19:06:13 crc kubenswrapper[4772]: I0930 19:06:13.489527 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" event={"ID":"b4f5263a-e583-4269-80e5-98750b7e5e6d","Type":"ContainerDied","Data":"16b4182f652abda7ae9ddc4be51a1417db155c0470605b6305aa80b843643714"} Sep 30 19:06:13 crc kubenswrapper[4772]: I0930 19:06:13.899745 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:06:13 crc kubenswrapper[4772]: E0930 19:06:13.900422 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.598230 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.634806 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-j7f6j"] Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.645682 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-j7f6j"] Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.649038 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shr7x\" (UniqueName: \"kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x\") pod \"b4f5263a-e583-4269-80e5-98750b7e5e6d\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.649361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host\") pod \"b4f5263a-e583-4269-80e5-98750b7e5e6d\" (UID: \"b4f5263a-e583-4269-80e5-98750b7e5e6d\") " Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.649401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host" (OuterVolumeSpecName: "host") pod "b4f5263a-e583-4269-80e5-98750b7e5e6d" (UID: "b4f5263a-e583-4269-80e5-98750b7e5e6d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.649834 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f5263a-e583-4269-80e5-98750b7e5e6d-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.654273 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x" (OuterVolumeSpecName: "kube-api-access-shr7x") pod "b4f5263a-e583-4269-80e5-98750b7e5e6d" (UID: "b4f5263a-e583-4269-80e5-98750b7e5e6d"). InnerVolumeSpecName "kube-api-access-shr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:14 crc kubenswrapper[4772]: I0930 19:06:14.752194 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shr7x\" (UniqueName: \"kubernetes.io/projected/b4f5263a-e583-4269-80e5-98750b7e5e6d-kube-api-access-shr7x\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.509670 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cefe0d9c94c11e33fc10f8bbc5f71d6e5e2d2929405ec46217af071040ac4cf" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.509805 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-j7f6j" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.852389 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs27s/crc-debug-fjdms"] Sep 30 19:06:15 crc kubenswrapper[4772]: E0930 19:06:15.852829 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f5263a-e583-4269-80e5-98750b7e5e6d" containerName="container-00" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.852843 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f5263a-e583-4269-80e5-98750b7e5e6d" containerName="container-00" Sep 30 19:06:15 crc kubenswrapper[4772]: E0930 19:06:15.852858 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="extract-utilities" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.852864 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="extract-utilities" Sep 30 19:06:15 crc kubenswrapper[4772]: E0930 19:06:15.852880 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="extract-content" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.852886 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="extract-content" Sep 30 19:06:15 crc kubenswrapper[4772]: E0930 19:06:15.852915 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.852922 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.853173 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1695a4ce-a501-4556-97d1-4da0d997f38c" containerName="registry-server" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.853193 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f5263a-e583-4269-80e5-98750b7e5e6d" containerName="container-00" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.853939 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.911136 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f5263a-e583-4269-80e5-98750b7e5e6d" path="/var/lib/kubelet/pods/b4f5263a-e583-4269-80e5-98750b7e5e6d/volumes" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.978953 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:15 crc kubenswrapper[4772]: I0930 19:06:15.979321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhxl\" (UniqueName: \"kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.081021 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhxl\" (UniqueName: \"kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.081142 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.081330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.098079 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhxl\" (UniqueName: \"kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl\") pod \"crc-debug-fjdms\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.170241 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.520844 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-fjdms" event={"ID":"9115c584-1aa7-4826-9763-d172f3fc198d","Type":"ContainerStarted","Data":"b7fffcb7c9039101e31b28f6aecf595248ae5b3313f0f2951fe724237cd3ce2a"} Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.521350 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-fjdms" event={"ID":"9115c584-1aa7-4826-9763-d172f3fc198d","Type":"ContainerStarted","Data":"a8b5483a26505fd124271cd89f1fe7dbdbf2fe916c1be5c07adb0a2a0df134c8"} Sep 30 19:06:16 crc kubenswrapper[4772]: I0930 19:06:16.537228 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs27s/crc-debug-fjdms" podStartSLOduration=1.537207845 podStartE2EDuration="1.537207845s" podCreationTimestamp="2025-09-30 19:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:16.532581164 +0000 UTC m=+7477.439594005" watchObservedRunningTime="2025-09-30 19:06:16.537207845 +0000 UTC m=+7477.444220676" Sep 30 19:06:17 crc kubenswrapper[4772]: I0930 19:06:17.534641 4772 generic.go:334] "Generic (PLEG): container finished" podID="9115c584-1aa7-4826-9763-d172f3fc198d" containerID="b7fffcb7c9039101e31b28f6aecf595248ae5b3313f0f2951fe724237cd3ce2a" exitCode=0 Sep 30 19:06:17 crc kubenswrapper[4772]: I0930 19:06:17.534731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-fjdms" event={"ID":"9115c584-1aa7-4826-9763-d172f3fc198d","Type":"ContainerDied","Data":"b7fffcb7c9039101e31b28f6aecf595248ae5b3313f0f2951fe724237cd3ce2a"} Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.674396 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.752268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nhxl\" (UniqueName: \"kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl\") pod \"9115c584-1aa7-4826-9763-d172f3fc198d\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.758292 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host\") pod \"9115c584-1aa7-4826-9763-d172f3fc198d\" (UID: \"9115c584-1aa7-4826-9763-d172f3fc198d\") " Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.758458 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host" (OuterVolumeSpecName: "host") pod "9115c584-1aa7-4826-9763-d172f3fc198d" (UID: "9115c584-1aa7-4826-9763-d172f3fc198d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.759398 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9115c584-1aa7-4826-9763-d172f3fc198d-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.771576 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl" (OuterVolumeSpecName: "kube-api-access-8nhxl") pod "9115c584-1aa7-4826-9763-d172f3fc198d" (UID: "9115c584-1aa7-4826-9763-d172f3fc198d"). InnerVolumeSpecName "kube-api-access-8nhxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:18 crc kubenswrapper[4772]: I0930 19:06:18.861229 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nhxl\" (UniqueName: \"kubernetes.io/projected/9115c584-1aa7-4826-9763-d172f3fc198d-kube-api-access-8nhxl\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:19 crc kubenswrapper[4772]: I0930 19:06:19.563579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-fjdms" event={"ID":"9115c584-1aa7-4826-9763-d172f3fc198d","Type":"ContainerDied","Data":"a8b5483a26505fd124271cd89f1fe7dbdbf2fe916c1be5c07adb0a2a0df134c8"} Sep 30 19:06:19 crc kubenswrapper[4772]: I0930 19:06:19.563636 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b5483a26505fd124271cd89f1fe7dbdbf2fe916c1be5c07adb0a2a0df134c8" Sep 30 19:06:19 crc kubenswrapper[4772]: I0930 19:06:19.563715 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-fjdms" Sep 30 19:06:25 crc kubenswrapper[4772]: I0930 19:06:25.903825 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:06:25 crc kubenswrapper[4772]: E0930 19:06:25.904818 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:06:26 crc kubenswrapper[4772]: I0930 19:06:26.947269 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-fjdms"] Sep 30 19:06:26 crc kubenswrapper[4772]: I0930 19:06:26.957767 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-fjdms"] Sep 30 19:06:27 crc kubenswrapper[4772]: I0930 19:06:27.911784 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9115c584-1aa7-4826-9763-d172f3fc198d" path="/var/lib/kubelet/pods/9115c584-1aa7-4826-9763-d172f3fc198d/volumes" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.156891 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs27s/crc-debug-9hnq4"] Sep 30 19:06:28 crc kubenswrapper[4772]: E0930 19:06:28.157311 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9115c584-1aa7-4826-9763-d172f3fc198d" containerName="container-00" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.157324 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9115c584-1aa7-4826-9763-d172f3fc198d" containerName="container-00" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.157534 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9115c584-1aa7-4826-9763-d172f3fc198d" containerName="container-00" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.158345 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.171394 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.171507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6fqb\" (UniqueName: \"kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.272976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.273036 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6fqb\" (UniqueName: \"kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.273197 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.294035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6fqb\" (UniqueName: \"kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb\") pod \"crc-debug-9hnq4\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.494211 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:28 crc kubenswrapper[4772]: I0930 19:06:28.666510 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" event={"ID":"3569c3de-8e00-4085-96df-482f53f2345c","Type":"ContainerStarted","Data":"728dd48f0ad12e430388918736393dc17dcf41a1d8c4e1c1ef634c370cf8faee"} Sep 30 19:06:29 crc kubenswrapper[4772]: I0930 19:06:29.676206 4772 generic.go:334] "Generic (PLEG): container finished" podID="3569c3de-8e00-4085-96df-482f53f2345c" containerID="e5f3e26f0c552ef606b7738f3e695d1ccf8b1bf3de485ebed677c09c8cecae8a" exitCode=0 Sep 30 19:06:29 crc kubenswrapper[4772]: I0930 19:06:29.676310 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" event={"ID":"3569c3de-8e00-4085-96df-482f53f2345c","Type":"ContainerDied","Data":"e5f3e26f0c552ef606b7738f3e695d1ccf8b1bf3de485ebed677c09c8cecae8a"} Sep 30 19:06:29 crc kubenswrapper[4772]: I0930 19:06:29.718648 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-9hnq4"] Sep 30 19:06:29 crc kubenswrapper[4772]: I0930 19:06:29.727524 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rs27s/crc-debug-9hnq4"] Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.811010 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.934009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6fqb\" (UniqueName: \"kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb\") pod \"3569c3de-8e00-4085-96df-482f53f2345c\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.934550 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host\") pod \"3569c3de-8e00-4085-96df-482f53f2345c\" (UID: \"3569c3de-8e00-4085-96df-482f53f2345c\") " Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.934742 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host" (OuterVolumeSpecName: "host") pod "3569c3de-8e00-4085-96df-482f53f2345c" (UID: "3569c3de-8e00-4085-96df-482f53f2345c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.935297 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3569c3de-8e00-4085-96df-482f53f2345c-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4772]: I0930 19:06:30.945495 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb" (OuterVolumeSpecName: "kube-api-access-w6fqb") pod "3569c3de-8e00-4085-96df-482f53f2345c" (UID: "3569c3de-8e00-4085-96df-482f53f2345c"). InnerVolumeSpecName "kube-api-access-w6fqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.037563 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6fqb\" (UniqueName: \"kubernetes.io/projected/3569c3de-8e00-4085-96df-482f53f2345c-kube-api-access-w6fqb\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.468424 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-tzz8m_80d5010e-a767-491b-bcb2-89272762a121/kube-rbac-proxy/0.log" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.525286 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-tzz8m_80d5010e-a767-491b-bcb2-89272762a121/manager/0.log" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.716425 4772 scope.go:117] "RemoveContainer" containerID="e5f3e26f0c552ef606b7738f3e695d1ccf8b1bf3de485ebed677c09c8cecae8a" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.716786 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/crc-debug-9hnq4" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.776934 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-n7w7p_ad2965ed-ed78-4646-97ae-07cce49e8eb1/kube-rbac-proxy/0.log" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.877605 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-n7w7p_ad2965ed-ed78-4646-97ae-07cce49e8eb1/manager/0.log" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.910500 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3569c3de-8e00-4085-96df-482f53f2345c" path="/var/lib/kubelet/pods/3569c3de-8e00-4085-96df-482f53f2345c/volumes" Sep 30 19:06:31 crc kubenswrapper[4772]: I0930 19:06:31.939157 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.141588 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.150393 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.170510 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.381800 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/pull/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.385860 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/util/0.log" Sep 30 
19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.392321 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2dacaafde932355be6b0389d44caaa794cd7ed27d7c2ca1cc33b02bb0d45jz_737191b4-9bb7-402d-bbdb-603bae58da8a/extract/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.573828 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-rqv96_832335d3-7446-4879-8ec1-8f24d6d3708a/manager/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.574812 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-rqv96_832335d3-7446-4879-8ec1-8f24d6d3708a/kube-rbac-proxy/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.698689 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-ghtj2_314c8eb1-ee8d-405d-9bb6-a74de21c2f01/kube-rbac-proxy/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.846736 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-b82x8_27e94b49-6017-4790-af32-61cdb6c41f2c/kube-rbac-proxy/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.877669 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-ghtj2_314c8eb1-ee8d-405d-9bb6-a74de21c2f01/manager/0.log" Sep 30 19:06:32 crc kubenswrapper[4772]: I0930 19:06:32.965615 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-b82x8_27e94b49-6017-4790-af32-61cdb6c41f2c/manager/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.053221 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-dn2kt_b7ba1160-070d-4cc4-9c53-75817bd6141e/kube-rbac-proxy/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.090867 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-dn2kt_b7ba1160-070d-4cc4-9c53-75817bd6141e/manager/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.259971 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-shhhk_058eec37-9f59-4fc5-8fa3-c9595bf58300/kube-rbac-proxy/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.511872 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-shhhk_058eec37-9f59-4fc5-8fa3-c9595bf58300/manager/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.531603 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-vdhkv_d10d7495-42f5-4919-8985-99913d62ab28/manager/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.535518 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-vdhkv_d10d7495-42f5-4919-8985-99913d62ab28/kube-rbac-proxy/0.log" Sep 30 19:06:33 crc kubenswrapper[4772]: I0930 19:06:33.712821 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-56vtc_44aed112-2ebc-48b6-b3b4-9a47d2dafaa9/kube-rbac-proxy/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.131225 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-56vtc_44aed112-2ebc-48b6-b3b4-9a47d2dafaa9/manager/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.146626 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2fx6p_c886af64-f9cc-4127-9d17-3007ae492d06/kube-rbac-proxy/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.241609 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2fx6p_c886af64-f9cc-4127-9d17-3007ae492d06/manager/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.341885 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vlpqf_1753608a-67af-4fa4-83f1-3f7d1623fc6b/kube-rbac-proxy/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.488151 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vlpqf_1753608a-67af-4fa4-83f1-3f7d1623fc6b/manager/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.533480 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-smllw_51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc/kube-rbac-proxy/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.618862 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-smllw_51e19ebe-c84b-4e1e-bf1a-fb09a03e3edc/manager/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.754208 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-z472f_df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff/kube-rbac-proxy/0.log" Sep 30 19:06:34 crc kubenswrapper[4772]: I0930 19:06:34.894436 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-z472f_df0c81ba-0648-4c3a-9ff0-7c5f5d8251ff/manager/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.007840 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-kpr6v_13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36/manager/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.035283 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-kpr6v_13e97cb1-f6e1-4f9d-bd3f-47292b0b5a36/kube-rbac-proxy/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.144402 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-dxbpz_69e18d49-1290-4440-a3c9-885352fa18c5/kube-rbac-proxy/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.233108 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-dxbpz_69e18d49-1290-4440-a3c9-885352fa18c5/manager/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.329907 4772 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd9b5767f-p4n9f_d5be7b91-f881-4cd5-878e-1d40a94a3a8d/kube-rbac-proxy/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.557165 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959786844-tbxrx_6d89b985-cd07-43bc-9024-ff6ffd1adc45/kube-rbac-proxy/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.753548 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rg4rc_04896286-2a65-451e-8639-d0f12941e991/registry-server/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.817474 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959786844-tbxrx_6d89b985-cd07-43bc-9024-ff6ffd1adc45/operator/0.log" Sep 30 19:06:35 crc kubenswrapper[4772]: I0930 19:06:35.837575 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-8xbv5_6c9f85e1-5df7-4943-9064-69af6e200e82/kube-rbac-proxy/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.068255 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mfnlh_f3a0e5a3-c50e-48ce-801d-f7916210165b/kube-rbac-proxy/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.151323 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-8xbv5_6c9f85e1-5df7-4943-9064-69af6e200e82/manager/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.206475 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mfnlh_f3a0e5a3-c50e-48ce-801d-f7916210165b/manager/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.440475 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-swgvc_1e8f518a-f6a2-4bfc-a4ed-d6580a97f55f/operator/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.466215 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-6m9mb_f8af3992-c401-4dea-b5a5-92063a05384e/kube-rbac-proxy/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.651038 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-6m9mb_f8af3992-c401-4dea-b5a5-92063a05384e/manager/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.654780 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-b6np7_d4295a68-a2dc-4b0b-a577-bbd6448d3a70/kube-rbac-proxy/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.901736 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-xmwpp_5b10f12b-b24a-4cf6-b07b-7b3e811ccd30/kube-rbac-proxy/0.log" Sep 30 19:06:36 crc kubenswrapper[4772]: I0930 19:06:36.994480 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-xmwpp_5b10f12b-b24a-4cf6-b07b-7b3e811ccd30/manager/0.log" Sep 30 19:06:37 crc kubenswrapper[4772]: I0930 19:06:37.115910 
4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-b6np7_d4295a68-a2dc-4b0b-a577-bbd6448d3a70/manager/0.log" Sep 30 19:06:37 crc kubenswrapper[4772]: I0930 19:06:37.120386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-86c75f6bd4-4fnzg_4fcd6b42-8644-41f5-bd3b-51184d34cd00/kube-rbac-proxy/0.log" Sep 30 19:06:37 crc kubenswrapper[4772]: I0930 19:06:37.136449 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd9b5767f-p4n9f_d5be7b91-f881-4cd5-878e-1d40a94a3a8d/manager/0.log" Sep 30 19:06:37 crc kubenswrapper[4772]: I0930 19:06:37.321079 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-86c75f6bd4-4fnzg_4fcd6b42-8644-41f5-bd3b-51184d34cd00/manager/0.log" Sep 30 19:06:37 crc kubenswrapper[4772]: I0930 19:06:37.898814 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:06:37 crc kubenswrapper[4772]: E0930 19:06:37.899143 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:06:48 crc kubenswrapper[4772]: I0930 19:06:48.898855 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:06:48 crc kubenswrapper[4772]: E0930 19:06:48.899934 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:06:53 crc kubenswrapper[4772]: I0930 19:06:53.885858 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gdmvr_0a900ef2-a0f1-4a8b-b33a-7316c70cbaa9/control-plane-machine-set-operator/0.log" Sep 30 19:06:54 crc kubenswrapper[4772]: I0930 19:06:54.037250 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klzl8_b023c669-cb19-4010-b9d7-120bdfff87bd/kube-rbac-proxy/0.log" Sep 30 19:06:54 crc kubenswrapper[4772]: I0930 19:06:54.065806 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klzl8_b023c669-cb19-4010-b9d7-120bdfff87bd/machine-api-operator/0.log" Sep 30 19:07:03 crc kubenswrapper[4772]: I0930 19:07:03.898484 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:07:03 crc kubenswrapper[4772]: E0930 19:07:03.899366 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:07:06 crc kubenswrapper[4772]: I0930 19:07:06.501458 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qtth7_75f76096-4236-46b9-8e3b-9e6784362607/cert-manager-controller/0.log" Sep 30 19:07:06 crc kubenswrapper[4772]: I0930 19:07:06.703231 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ws7dg_525413d1-592e-482c-a45a-0e88bfc94da5/cert-manager-cainjector/0.log" Sep 30 19:07:06 crc kubenswrapper[4772]: I0930 19:07:06.773425 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lkvcc_9e9dcd73-971e-4f2f-869a-317159d2c9a5/cert-manager-webhook/0.log" Sep 30 19:07:15 crc kubenswrapper[4772]: I0930 19:07:15.898625 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:07:15 crc kubenswrapper[4772]: E0930 19:07:15.899638 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.219401 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-m6pgg_f076e40b-6b99-4a23-8235-c008e4a209c5/nmstate-console-plugin/0.log" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.429853 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pf6qn_477c7640-d169-487f-a2d7-9164f8b26417/nmstate-handler/0.log" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.522379 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-sll7d_aed9b88e-7f1b-472d-a22c-ebf719c71f73/nmstate-metrics/0.log" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.523261 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-sll7d_aed9b88e-7f1b-472d-a22c-ebf719c71f73/kube-rbac-proxy/0.log" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.716722 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qds72_67cdd39b-a0de-4d14-ba2f-2419b31983da/nmstate-operator/0.log" Sep 30 19:07:20 crc kubenswrapper[4772]: I0930 19:07:20.728692 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-kphv7_4febaade-1298-413f-8f68-ca4771613783/nmstate-webhook/0.log" Sep 30 19:07:27 crc kubenswrapper[4772]: I0930 19:07:27.899410 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:07:27 crc kubenswrapper[4772]: E0930 19:07:27.900572 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
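The machine-config-daemon pair that keeps repeating above, an I-level "RemoveContainer" immediately followed by an E-level "back-off 5m0s restarting failed container" every 11 to 15 seconds, is the kubelet re-syncing a pod whose container sits in CrashLoopBackOff: each sync attempt is refused until the back-off window expires. To my knowledge kubelet's container restart back-off starts at 10s and doubles per crash up to a 5-minute cap, which is why every refusal here already reports the cap. An illustrative sketch of that capped doubling (not kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed defaults: 10s initial back-off, doubling per crash, 5m cap.
	backoff := 10 * time.Second
	const maxBackoff = 5 * time.Minute
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d refused: back-off %v\n", restart, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff // from here on every sync logs "back-off 5m0s"
		}
	}
}

After five or so crashes the printed sequence (10s, 20s, 40s, 1m20s, 2m40s, 5m0s, ...) pins at 5m0s, matching the repeated error text; the loop only ends when a restart finally succeeds, as it does at 19:08:50 further down.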
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.060185 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pbhmq_a7c392dd-0528-44c0-8fa6-85d8c33a4ac4/kube-rbac-proxy/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.286778 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.296403 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pbhmq_a7c392dd-0528-44c0-8fa6-85d8c33a4ac4/controller/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.516616 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.526258 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.526449 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.570142 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.774709 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.819539 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.819545 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log"
Sep 30 19:07:37 crc kubenswrapper[4772]: I0930 19:07:37.833442 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.013597 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-frr-files/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.046742 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-reloader/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.076519 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/cp-metrics/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.103117 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/controller/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.313597 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/frr-metrics/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.509453 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/kube-rbac-proxy-frr/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.542624 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/kube-rbac-proxy/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.889678 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/reloader/0.log"
Sep 30 19:07:38 crc kubenswrapper[4772]: I0930 19:07:38.922479 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-q2g5l_90506d56-68ff-4821-9594-0bfaa2ef2b57/frr-k8s-webhook-server/0.log"
Sep 30 19:07:39 crc kubenswrapper[4772]: I0930 19:07:39.208719 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cbbfcbbd-w9mxx_d8b0a4f0-a6d9-46ff-9487-98fec1d43e07/manager/0.log"
Sep 30 19:07:39 crc kubenswrapper[4772]: I0930 19:07:39.433574 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f67d8696d-jl7tp_fe6335cc-f638-4411-85e6-bf6beea1f24f/webhook-server/0.log"
Sep 30 19:07:39 crc kubenswrapper[4772]: I0930 19:07:39.512849 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kd7pn_d1d9e7ba-297f-4ef1-913a-afb210b83c2a/kube-rbac-proxy/0.log"
Sep 30 19:07:40 crc kubenswrapper[4772]: I0930 19:07:40.148179 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f68tb_dbe8dedf-164d-43b2-9b38-4abcae7fb3e5/frr/0.log"
Sep 30 19:07:40 crc kubenswrapper[4772]: I0930 19:07:40.272162 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kd7pn_d1d9e7ba-297f-4ef1-913a-afb210b83c2a/speaker/0.log"
Sep 30 19:07:42 crc kubenswrapper[4772]: I0930 19:07:42.898617 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"
Sep 30 19:07:42 crc kubenswrapper[4772]: E0930 19:07:42.899410 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.319406 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.523290 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.541227 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.558872 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.752840 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/util/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.763411 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/extract/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.774779 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjvk7p_3311e11b-7e62-409e-95e9-88528c9bffbb/pull/0.log"
Sep 30 19:07:52 crc kubenswrapper[4772]: I0930 19:07:52.917755 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.070952 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.100393 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.100927 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.266004 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/util/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.272321 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/extract/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.297207 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6rbgh_88994732-fb76-4b3b-aff1-9f27baea5f53/pull/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.440392 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.631455 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.632357 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.639430 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.769177 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-utilities/0.log"
Sep 30 19:07:53 crc kubenswrapper[4772]: I0930 19:07:53.824420 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/extract-content/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.017436 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.280357 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.280392 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.317216 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.521018 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-content/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.650302 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/extract-utilities/0.log"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.899991 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"
Sep 30 19:07:54 crc kubenswrapper[4772]: E0930 19:07:54.903265 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7"
Sep 30 19:07:54 crc kubenswrapper[4772]: I0930 19:07:54.912109 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.137444 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.186770 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7knsg_1522b5bf-cf61-4f95-a15c-63245f3eab54/registry-server/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.192093 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.408363 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.541681 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/util/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.631650 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/pull/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.690261 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9678g5w_d4f52924-d141-4724-838f-d3bfd6dab358/extract/0.log"
Sep 30 19:07:55 crc kubenswrapper[4772]: I0930 19:07:55.931093 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8qmk7_aae4ed0a-da1e-4581-913e-1c3c8c1554cc/marketplace-operator/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.033931 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cl5mx_a71ddc7a-9a49-4cd9-842a-c6f24957a6e3/registry-server/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.059575 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.230932 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.236628 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.241276 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.484524 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.491263 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-content/0.log"
Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.491894 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log"
path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/extract-utilities/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.669857 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22cvm_c7d4b164-e082-47f7-ab01-643d7bb3788b/registry-server/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.708645 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.727744 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.745588 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.909177 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-content/0.log" Sep 30 19:07:56 crc kubenswrapper[4772]: I0930 19:07:56.931155 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/extract-utilities/0.log" Sep 30 19:07:57 crc kubenswrapper[4772]: I0930 19:07:57.785793 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4zvrj_ee561213-a3c6-4429-9f8d-f670a07494c5/registry-server/0.log" Sep 30 19:08:08 crc kubenswrapper[4772]: I0930 19:08:08.281330 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-qkggk_3cb2995b-6088-4762-8e3b-d99d0eaf03ed/prometheus-operator/0.log" Sep 30 19:08:08 crc kubenswrapper[4772]: I0930 19:08:08.465636 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b79d4788f-drcw5_167ceeed-fcd1-409a-b655-f17da9529300/prometheus-operator-admission-webhook/0.log" Sep 30 19:08:08 crc kubenswrapper[4772]: I0930 19:08:08.494323 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b79d4788f-zhwfn_9acc0016-89fe-4a76-a443-b19b593dc666/prometheus-operator-admission-webhook/0.log" Sep 30 19:08:08 crc kubenswrapper[4772]: I0930 19:08:08.641216 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-zm69t_4ce80066-009d-4bb9-8a33-dcb521b0e08c/operator/0.log" Sep 30 19:08:08 crc kubenswrapper[4772]: I0930 19:08:08.683440 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-b9gwr_28b404db-1018-43c7-bdba-e2b0d97e1a8c/perses-operator/0.log" Sep 30 19:08:09 crc kubenswrapper[4772]: I0930 19:08:09.904565 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:08:09 crc kubenswrapper[4772]: E0930 19:08:09.906206 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:08:22 crc kubenswrapper[4772]: I0930 19:08:22.898843 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:08:22 crc kubenswrapper[4772]: E0930 19:08:22.900584 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.103910 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:24 crc kubenswrapper[4772]: E0930 19:08:24.105328 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3569c3de-8e00-4085-96df-482f53f2345c" containerName="container-00" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.105345 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3569c3de-8e00-4085-96df-482f53f2345c" containerName="container-00" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.105556 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3569c3de-8e00-4085-96df-482f53f2345c" containerName="container-00" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.112041 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.161150 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.269046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.269164 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.269206 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfj9t\" (UniqueName: \"kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.371312 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content\") pod 
\"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.371687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfj9t\" (UniqueName: \"kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.372097 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.372279 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.372878 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.395940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfj9t\" (UniqueName: \"kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t\") pod \"community-operators-hgt57\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:24 crc kubenswrapper[4772]: I0930 19:08:24.463029 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:25 crc kubenswrapper[4772]: I0930 19:08:25.090788 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:25 crc kubenswrapper[4772]: I0930 19:08:25.949260 4772 generic.go:334] "Generic (PLEG): container finished" podID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerID="373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c" exitCode=0 Sep 30 19:08:25 crc kubenswrapper[4772]: I0930 19:08:25.949567 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerDied","Data":"373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c"} Sep 30 19:08:25 crc kubenswrapper[4772]: I0930 19:08:25.949592 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerStarted","Data":"3900ef0fc61413786261bf0f3e05a6ec46e40fe437979c7b99f7b35049d2528a"} Sep 30 19:08:25 crc kubenswrapper[4772]: I0930 19:08:25.952038 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:08:27 crc kubenswrapper[4772]: I0930 19:08:27.968766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerStarted","Data":"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9"} Sep 30 19:08:28 crc kubenswrapper[4772]: I0930 19:08:28.984606 4772 generic.go:334] "Generic (PLEG): container finished" podID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerID="306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9" exitCode=0 Sep 30 19:08:28 crc kubenswrapper[4772]: I0930 19:08:28.984751 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerDied","Data":"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9"} Sep 30 19:08:30 crc kubenswrapper[4772]: I0930 19:08:29.999911 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerStarted","Data":"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac"} Sep 30 19:08:30 crc kubenswrapper[4772]: I0930 19:08:30.034093 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgt57" podStartSLOduration=2.472093089 podStartE2EDuration="6.034024912s" podCreationTimestamp="2025-09-30 19:08:24 +0000 UTC" firstStartedPulling="2025-09-30 19:08:25.951846643 +0000 UTC m=+7606.858859474" lastFinishedPulling="2025-09-30 19:08:29.513778466 +0000 UTC m=+7610.420791297" observedRunningTime="2025-09-30 19:08:30.027220165 +0000 UTC m=+7610.934232996" watchObservedRunningTime="2025-09-30 19:08:30.034024912 +0000 UTC m=+7610.941037743" Sep 30 19:08:34 crc kubenswrapper[4772]: I0930 19:08:34.465298 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:34 crc kubenswrapper[4772]: I0930 19:08:34.466069 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:34 crc kubenswrapper[4772]: I0930 19:08:34.556971 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:35 crc kubenswrapper[4772]: I0930 19:08:35.133962 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:35 crc kubenswrapper[4772]: I0930 19:08:35.203931 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:36 crc kubenswrapper[4772]: I0930 19:08:36.898691 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:08:36 crc kubenswrapper[4772]: E0930 19:08:36.899564 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rkhll_openshift-machine-config-operator(8e885147-8bd5-4c7a-9331-ec1f4eebd3f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.080297 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgt57" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="registry-server" containerID="cri-o://905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac" gracePeriod=2 Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.594493 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.751554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content\") pod \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.751883 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities\") pod \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.752094 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfj9t\" (UniqueName: \"kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t\") pod \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\" (UID: \"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2\") " Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.753113 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities" (OuterVolumeSpecName: "utilities") pod "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" (UID: "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.771520 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t" (OuterVolumeSpecName: "kube-api-access-sfj9t") pod "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" (UID: "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2"). InnerVolumeSpecName "kube-api-access-sfj9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.820981 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" (UID: "dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.854667 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfj9t\" (UniqueName: \"kubernetes.io/projected/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-kube-api-access-sfj9t\") on node \"crc\" DevicePath \"\"" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.854710 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:08:37 crc kubenswrapper[4772]: I0930 19:08:37.854765 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.091435 4772 generic.go:334] "Generic (PLEG): container finished" podID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerID="905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac" exitCode=0 Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.091487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerDied","Data":"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac"} Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.091544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgt57" event={"ID":"dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2","Type":"ContainerDied","Data":"3900ef0fc61413786261bf0f3e05a6ec46e40fe437979c7b99f7b35049d2528a"} Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.091566 4772 scope.go:117] "RemoveContainer" containerID="905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.091563 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgt57" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.137374 4772 scope.go:117] "RemoveContainer" containerID="306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.140890 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.149390 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgt57"] Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.200487 4772 scope.go:117] "RemoveContainer" containerID="373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.226241 4772 scope.go:117] "RemoveContainer" containerID="905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac" Sep 30 19:08:38 crc kubenswrapper[4772]: E0930 19:08:38.226746 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac\": container with ID starting with 905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac not found: ID does not exist" containerID="905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.226776 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac"} err="failed to get container status \"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac\": rpc error: code = NotFound desc = could not find container \"905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac\": container with ID starting with 905be4db992251725e0f8e363c611292fd3581c6d190035e219821511cfa63ac not found: ID does not exist" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.226796 4772 scope.go:117] "RemoveContainer" containerID="306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9" Sep 30 19:08:38 crc kubenswrapper[4772]: E0930 19:08:38.227158 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9\": container with ID starting with 306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9 not found: ID does not exist" containerID="306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.227181 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9"} err="failed to get container status \"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9\": rpc error: code = NotFound desc = could not find container \"306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9\": container with ID starting with 306fbfb0aa80488bce6a72a2df575a3ecc319aed146d1290f681978c37b1d3c9 not found: ID does not exist" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.227193 4772 scope.go:117] "RemoveContainer" containerID="373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c" Sep 30 19:08:38 crc kubenswrapper[4772]: E0930 19:08:38.227502 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c\": container with ID starting with 373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c not found: ID does not exist" containerID="373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c" Sep 30 19:08:38 crc kubenswrapper[4772]: I0930 19:08:38.227522 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c"} err="failed to get container status \"373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c\": rpc error: code = NotFound desc = could not find container \"373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c\": container with ID starting with 373ec8dd81c44fb9c34233d6a2f88aa77b42c6f4ad00d32a0cf1b51763c7470c not found: ID does not exist" Sep 30 19:08:39 crc kubenswrapper[4772]: I0930 19:08:39.944577 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" path="/var/lib/kubelet/pods/dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2/volumes" Sep 30 19:08:49 crc kubenswrapper[4772]: I0930 19:08:49.900272 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203" Sep 30 19:08:50 crc kubenswrapper[4772]: I0930 19:08:50.221911 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"30642c1dccae00e1a6c51a4309271b0abdf33cc7bfdd0e81d95f067af0f029c7"} Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.028170 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:09:38 crc kubenswrapper[4772]: E0930 19:09:38.029711 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="extract-content" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.029730 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="extract-content" Sep 30 19:09:38 crc kubenswrapper[4772]: E0930 19:09:38.029780 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="extract-utilities" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.029790 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="extract-utilities" Sep 30 19:09:38 crc kubenswrapper[4772]: E0930 19:09:38.029812 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="registry-server" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.029820 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="registry-server" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.030136 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee92ea9-2eb3-494b-ac8a-8d3edc2eaee2" containerName="registry-server" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.032336 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.039869 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.110300 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.110373 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.110653 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvj8j\" (UniqueName: \"kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.213811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.213880 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.213955 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvj8j\" (UniqueName: \"kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.214440 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.214461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.242710 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jvj8j\" (UniqueName: \"kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j\") pod \"redhat-marketplace-snkch\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.370169 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:38 crc kubenswrapper[4772]: I0930 19:09:38.868826 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:09:38 crc kubenswrapper[4772]: W0930 19:09:38.877409 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e28ffc9_e728_4dbf_8742_c45081a0df7c.slice/crio-231656b1a462f70e6e6d00e45aa00e6024c2f97de5dfdd62befe6fefe198ebae WatchSource:0}: Error finding container 231656b1a462f70e6e6d00e45aa00e6024c2f97de5dfdd62befe6fefe198ebae: Status 404 returned error can't find the container with id 231656b1a462f70e6e6d00e45aa00e6024c2f97de5dfdd62befe6fefe198ebae Sep 30 19:09:39 crc kubenswrapper[4772]: I0930 19:09:39.815805 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e28ffc9-e728-4dbf-8742-c45081a0df7c" containerID="ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d" exitCode=0 Sep 30 19:09:39 crc kubenswrapper[4772]: I0930 19:09:39.816125 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerDied","Data":"ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d"} Sep 30 19:09:39 crc kubenswrapper[4772]: I0930 19:09:39.816442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerStarted","Data":"231656b1a462f70e6e6d00e45aa00e6024c2f97de5dfdd62befe6fefe198ebae"} Sep 30 19:09:41 crc kubenswrapper[4772]: I0930 19:09:41.837033 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerStarted","Data":"39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356"} Sep 30 19:09:42 crc kubenswrapper[4772]: I0930 19:09:42.853468 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e28ffc9-e728-4dbf-8742-c45081a0df7c" containerID="39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356" exitCode=0 Sep 30 19:09:42 crc kubenswrapper[4772]: I0930 19:09:42.853553 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerDied","Data":"39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356"} Sep 30 19:09:44 crc kubenswrapper[4772]: I0930 19:09:44.892733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerStarted","Data":"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037"} Sep 30 19:09:44 crc kubenswrapper[4772]: I0930 19:09:44.920580 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snkch" podStartSLOduration=3.006480449 
podStartE2EDuration="6.920544694s" podCreationTimestamp="2025-09-30 19:09:38 +0000 UTC" firstStartedPulling="2025-09-30 19:09:39.819379833 +0000 UTC m=+7680.726392684" lastFinishedPulling="2025-09-30 19:09:43.733444078 +0000 UTC m=+7684.640456929" observedRunningTime="2025-09-30 19:09:44.918041889 +0000 UTC m=+7685.825054720" watchObservedRunningTime="2025-09-30 19:09:44.920544694 +0000 UTC m=+7685.827557535" Sep 30 19:09:48 crc kubenswrapper[4772]: I0930 19:09:48.371239 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:48 crc kubenswrapper[4772]: I0930 19:09:48.372451 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:48 crc kubenswrapper[4772]: I0930 19:09:48.420723 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:48 crc kubenswrapper[4772]: I0930 19:09:48.757511 4772 scope.go:117] "RemoveContainer" containerID="16b4182f652abda7ae9ddc4be51a1417db155c0470605b6305aa80b843643714" Sep 30 19:09:58 crc kubenswrapper[4772]: I0930 19:09:58.424315 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:58 crc kubenswrapper[4772]: I0930 19:09:58.470377 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.038248 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snkch" podUID="6e28ffc9-e728-4dbf-8742-c45081a0df7c" containerName="registry-server" containerID="cri-o://3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037" gracePeriod=2 Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.524443 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.666430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content\") pod \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.666870 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvj8j\" (UniqueName: \"kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j\") pod \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.666926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities\") pod \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\" (UID: \"6e28ffc9-e728-4dbf-8742-c45081a0df7c\") " Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.667621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities" (OuterVolumeSpecName: "utilities") pod "6e28ffc9-e728-4dbf-8742-c45081a0df7c" (UID: "6e28ffc9-e728-4dbf-8742-c45081a0df7c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.667769 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.672640 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j" (OuterVolumeSpecName: "kube-api-access-jvj8j") pod "6e28ffc9-e728-4dbf-8742-c45081a0df7c" (UID: "6e28ffc9-e728-4dbf-8742-c45081a0df7c"). InnerVolumeSpecName "kube-api-access-jvj8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.679916 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e28ffc9-e728-4dbf-8742-c45081a0df7c" (UID: "6e28ffc9-e728-4dbf-8742-c45081a0df7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.770010 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvj8j\" (UniqueName: \"kubernetes.io/projected/6e28ffc9-e728-4dbf-8742-c45081a0df7c-kube-api-access-jvj8j\") on node \"crc\" DevicePath \"\"" Sep 30 19:09:59 crc kubenswrapper[4772]: I0930 19:09:59.770074 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e28ffc9-e728-4dbf-8742-c45081a0df7c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.052409 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e28ffc9-e728-4dbf-8742-c45081a0df7c" containerID="3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037" exitCode=0 Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.052457 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snkch" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.052481 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerDied","Data":"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037"} Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.052513 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snkch" event={"ID":"6e28ffc9-e728-4dbf-8742-c45081a0df7c","Type":"ContainerDied","Data":"231656b1a462f70e6e6d00e45aa00e6024c2f97de5dfdd62befe6fefe198ebae"} Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.052528 4772 scope.go:117] "RemoveContainer" containerID="3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.084464 4772 scope.go:117] "RemoveContainer" containerID="39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.084465 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.093760 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snkch"] Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.111461 4772 scope.go:117] "RemoveContainer" containerID="ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.165493 4772 scope.go:117] "RemoveContainer" containerID="3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037" Sep 30 19:10:00 crc kubenswrapper[4772]: E0930 19:10:00.165910 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037\": container with ID starting with 3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037 not found: ID does not exist" containerID="3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.165941 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037"} err="failed to get container status \"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037\": rpc error: code = NotFound desc = could not find container \"3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037\": container with ID starting with 3955145393f596297ca321776547b33bd4cd9a7bd6111768e66819e264d84037 not found: ID does not exist" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.165961 4772 scope.go:117] "RemoveContainer" containerID="39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356" Sep 30 19:10:00 crc kubenswrapper[4772]: E0930 19:10:00.166491 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356\": container with ID starting with 39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356 not found: ID does not exist" containerID="39d49b6964de4b5751392bff30105676f92dd12bf10142aa19e39b45cb7f1356" Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.166519 4772 
Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.166539 4772 scope.go:117] "RemoveContainer" containerID="ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d"
Sep 30 19:10:00 crc kubenswrapper[4772]: E0930 19:10:00.166852 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d\": container with ID starting with ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d not found: ID does not exist" containerID="ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d"
Sep 30 19:10:00 crc kubenswrapper[4772]: I0930 19:10:00.166875 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d"} err="failed to get container status \"ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d\": rpc error: code = NotFound desc = could not find container \"ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d\": container with ID starting with ce15d253b5ca5b5724fba763a90e5586d088f76e04472d2d821c0db89fcc121d not found: ID does not exist"
Sep 30 19:10:01 crc kubenswrapper[4772]: I0930 19:10:01.910573 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e28ffc9-e728-4dbf-8742-c45081a0df7c" path="/var/lib/kubelet/pods/6e28ffc9-e728-4dbf-8742-c45081a0df7c/volumes"
Sep 30 19:10:58 crc kubenswrapper[4772]: I0930 19:10:58.725475 4772 generic.go:334] "Generic (PLEG): container finished" podID="8421b437-89eb-482f-a03b-898e483527e1" containerID="6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9" exitCode=0
Sep 30 19:10:58 crc kubenswrapper[4772]: I0930 19:10:58.725571 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs27s/must-gather-v5kw2" event={"ID":"8421b437-89eb-482f-a03b-898e483527e1","Type":"ContainerDied","Data":"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9"}
Sep 30 19:10:58 crc kubenswrapper[4772]: I0930 19:10:58.727314 4772 scope.go:117] "RemoveContainer" containerID="6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9"
Sep 30 19:10:59 crc kubenswrapper[4772]: I0930 19:10:59.040284 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rs27s_must-gather-v5kw2_8421b437-89eb-482f-a03b-898e483527e1/gather/0.log"
Sep 30 19:11:08 crc kubenswrapper[4772]: I0930 19:11:08.656239 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:11:08 crc kubenswrapper[4772]: I0930 19:11:08.657431 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.042573 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rs27s/must-gather-v5kw2"] Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.043477 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rs27s/must-gather-v5kw2" podUID="8421b437-89eb-482f-a03b-898e483527e1" containerName="copy" containerID="cri-o://2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed" gracePeriod=2 Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.052749 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rs27s/must-gather-v5kw2"] Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.512943 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rs27s_must-gather-v5kw2_8421b437-89eb-482f-a03b-898e483527e1/copy/0.log" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.513723 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.622808 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output\") pod \"8421b437-89eb-482f-a03b-898e483527e1\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.622952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhx2t\" (UniqueName: \"kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t\") pod \"8421b437-89eb-482f-a03b-898e483527e1\" (UID: \"8421b437-89eb-482f-a03b-898e483527e1\") " Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.629292 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t" (OuterVolumeSpecName: "kube-api-access-qhx2t") pod "8421b437-89eb-482f-a03b-898e483527e1" (UID: "8421b437-89eb-482f-a03b-898e483527e1"). InnerVolumeSpecName "kube-api-access-qhx2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.728538 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhx2t\" (UniqueName: \"kubernetes.io/projected/8421b437-89eb-482f-a03b-898e483527e1-kube-api-access-qhx2t\") on node \"crc\" DevicePath \"\"" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.839877 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8421b437-89eb-482f-a03b-898e483527e1" (UID: "8421b437-89eb-482f-a03b-898e483527e1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.914724 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rs27s_must-gather-v5kw2_8421b437-89eb-482f-a03b-898e483527e1/copy/0.log" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.915466 4772 generic.go:334] "Generic (PLEG): container finished" podID="8421b437-89eb-482f-a03b-898e483527e1" containerID="2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed" exitCode=143 Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.915528 4772 scope.go:117] "RemoveContainer" containerID="2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.915557 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs27s/must-gather-v5kw2" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.936813 4772 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8421b437-89eb-482f-a03b-898e483527e1-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 19:11:14 crc kubenswrapper[4772]: I0930 19:11:14.945247 4772 scope.go:117] "RemoveContainer" containerID="6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9" Sep 30 19:11:15 crc kubenswrapper[4772]: I0930 19:11:15.058199 4772 scope.go:117] "RemoveContainer" containerID="2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed" Sep 30 19:11:15 crc kubenswrapper[4772]: E0930 19:11:15.058768 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed\": container with ID starting with 2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed not found: ID does not exist" containerID="2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed" Sep 30 19:11:15 crc kubenswrapper[4772]: I0930 19:11:15.058826 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed"} err="failed to get container status \"2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed\": rpc error: code = NotFound desc = could not find container \"2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed\": container with ID starting with 2f073dd59b53dde2f53b1ea5b3061c99b25002ab15ea39c3fbe3662c3c7049ed not found: ID does not exist" Sep 30 19:11:15 crc kubenswrapper[4772]: I0930 19:11:15.058864 4772 scope.go:117] "RemoveContainer" containerID="6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9" Sep 30 19:11:15 crc kubenswrapper[4772]: E0930 19:11:15.059251 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9\": container with ID starting with 6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9 not found: ID does not exist" containerID="6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9" Sep 30 19:11:15 crc kubenswrapper[4772]: I0930 19:11:15.059294 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9"} err="failed to get container status 
\"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9\": rpc error: code = NotFound desc = could not find container \"6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9\": container with ID starting with 6c626330613a5a7b27caaef45e5de7e4f950b6645a3d2097763cd72b0d7eefd9 not found: ID does not exist" Sep 30 19:11:15 crc kubenswrapper[4772]: I0930 19:11:15.915414 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8421b437-89eb-482f-a03b-898e483527e1" path="/var/lib/kubelet/pods/8421b437-89eb-482f-a03b-898e483527e1/volumes" Sep 30 19:11:38 crc kubenswrapper[4772]: I0930 19:11:38.654818 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:11:38 crc kubenswrapper[4772]: I0930 19:11:38.655911 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:12:08 crc kubenswrapper[4772]: I0930 19:12:08.655675 4772 patch_prober.go:28] interesting pod/machine-config-daemon-rkhll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:12:08 crc kubenswrapper[4772]: I0930 19:12:08.656817 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:12:08 crc kubenswrapper[4772]: I0930 19:12:08.656908 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" Sep 30 19:12:08 crc kubenswrapper[4772]: I0930 19:12:08.658103 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30642c1dccae00e1a6c51a4309271b0abdf33cc7bfdd0e81d95f067af0f029c7"} pod="openshift-machine-config-operator/machine-config-daemon-rkhll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:12:08 crc kubenswrapper[4772]: I0930 19:12:08.658192 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" podUID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerName="machine-config-daemon" containerID="cri-o://30642c1dccae00e1a6c51a4309271b0abdf33cc7bfdd0e81d95f067af0f029c7" gracePeriod=600 Sep 30 19:12:09 crc kubenswrapper[4772]: I0930 19:12:09.528821 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e885147-8bd5-4c7a-9331-ec1f4eebd3f7" containerID="30642c1dccae00e1a6c51a4309271b0abdf33cc7bfdd0e81d95f067af0f029c7" exitCode=0 Sep 30 19:12:09 crc kubenswrapper[4772]: I0930 19:12:09.528910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" 
event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerDied","Data":"30642c1dccae00e1a6c51a4309271b0abdf33cc7bfdd0e81d95f067af0f029c7"} Sep 30 19:12:09 crc kubenswrapper[4772]: I0930 19:12:09.529377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rkhll" event={"ID":"8e885147-8bd5-4c7a-9331-ec1f4eebd3f7","Type":"ContainerStarted","Data":"d56b61bbad432f52fe08cea81b24ab6ae0c1407d815e83d55e7cfeeed2f253c2"} Sep 30 19:12:09 crc kubenswrapper[4772]: I0930 19:12:09.529401 4772 scope.go:117] "RemoveContainer" containerID="fabf03c90525f2169223a0f9282625f98fa468ad550333649f415c5455056203"